Mars 6.0.7 Syslog Requirement for Enterasys Dragon NIDS 7.x

Apparently the MARS docs are incorrect when it comes to fashioning a syslog message from a Dragon 7.x NIDS. I formatted the message as requested but MARS keeps displaying "Unknown Device Event". The Event IDs are correct but MARS does not recognize the syslog messages as coming from the Dragon. Does anyone know what the MARS parser is expecting for an Enterasys message? As I said, I used the example in the MARS 6.x Device Configuration Guide and it did not work. One of the MARS guides actually displays what is expected for a Snort message and I was hoping there was such an example for Dragon. Thanks.

You can create a support package from the Dragon 6.x signatures provided by MARS and fashion them for 7.x. I wish I could provide the support package we created but we are not allowed to export it from the customer site. Basically here is what you do:
1. Create your own Device Type for Dragon 7.x. You can define it as an appliance or software but we opted for "appliance".
2. Modify your Dragon ESM to export syslog messages in the following format:
    %DATE% %TIME% SrcIP=%SIP% SrcPort=%SPORT% DstIP=%DIP% DstPort=%DPORT% Protocol=%PROTO% %NAME% %SENSOR%
We tested this with an NMAP scan which resulted in the following syslog message as received by MARS:
<175>alarmtool: 2010-08-11 15:12:22 SrcIP=172.16.1.1 SrcPort=0 DstIP=172.16.1.2 DstPort=0 Protocol=0 TCP-SCAN dragon-VS1
3. Create one Device Event Type using the following parse pattern:
Position  Key Pattern   Parsed Field         Value Type        Value Format        Value Pattern
1         alarmtool:    Device Time          Time              %Y-%m-%d %H:%M:%S   \d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}
2         .+SrcIP\=     Source Address       IPV4 Dotted Quad                      (\d{1,3}\.){3}\d{1,3}
3         .+SrcPort\=   Source Port          Port Number                           ((0x[a-fA-F\d]{1,4})|(0\d{1,6})|([1-9]\d{0,4})|0)
4         .+DstIP\=     Destination Address  IPV4 Dotted Quad                      (\d{1,3}\.){3}\d{1,3}
5         .+DstPort\=   Destination Port     Port Number                           ((0x[a-fA-F\d]{1,4})|(0\d{1,6})|([1-9]\d{0,4})|0)
6         .+Protocol\=  Protocol             Protocol Number                       ((0x[a-fA-F\d]{1,2})|(0\d{1,3})|([1-9]\d{0,2})|0)
7         .+TCP-SCAN    None                 String                                ([\w-]+)\-?[\w-]{3}
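You can sanity-check those value patterns against the sample message from step 2 before touching MARS. This is just my own quick test with GNU grep; the `(?<=...)` lookbehinds stand in for the key patterns and are not part of the MARS definition:

```shell
# Sample syslog message as received by MARS (from the NMAP test above)
msg='<175>alarmtool: 2010-08-11 15:12:22 SrcIP=172.16.1.1 SrcPort=0 DstIP=172.16.1.2 DstPort=0 Protocol=0 TCP-SCAN dragon-VS1'

# Apply each Value Pattern after its key, the way the parser would
printf '%s\n' "$msg" | grep -oP '\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}'   # device time
printf '%s\n' "$msg" | grep -oP '(?<=SrcIP=)(\d{1,3}\.){3}\d{1,3}'                # source address
printf '%s\n' "$msg" | grep -oP '(?<=DstPort=)((0x[a-fA-F\d]{1,4})|(0\d{1,6})|([1-9]\d{0,4})|0)'  # destination port
```

If any of these come back empty, the format string from step 2 did not take effect on the ESM side.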
4. You can now export a support package that will give you the XML format needed for your new 7.x support package. The XML file will reside in the ZIP file created by the export process.
5. You will now need the Device Event Numbers and Device Event IDs used by the Dragon 6.x signatures. These can be retrieved from within MARS by browsing to the Dragon 6.x NIDS Device Events. Make sure you view ALL of the Events by selecting the "10,000" rows per page option. Now right-click on this page and select "View Source". Save this to a file (Ex. Dragon6_Events.txt).
6. You now have to extract the important data from the file created in step 5. This can be done with a few Linux grep statements and a text editor.
    a. To extract the Device Event Numbers, you can use the following grep script:
        grep '!--' Dragon6_Events.txt | grep -o -P '[0-9]{7,8}\ /?([0-9]{1,5})?' > Cisco_Dragon_Event_Numbers.txt
        NOTE: This file will be used to define the etList section of the XML file.
    b. Extract the Dragon Event IDs and numbers from the Dragon6_Events.txt file:
        grep -B2 '!--' Dragon6_Events.txt > Dragon6_Events_Stripped.txt
    c. Use grep or a text editor to remove everything from Dragon6_Events_Stripped.txt except for the Event IDs and numbers.
        When done your file should contain data in the following format:
        SPY:TOPREBATES-CONFIRM
        6503131
        ACROBAT:PDF-EXPLOIT-MALWARE
        6503132
NOTE: If a Windows text editor was used for any of the edits you will want to run "dos2unix" against the files.
7. Start creating your new support package XML file by:
    a. Open the "data_package.xml" file from the support package created in step 4.
    b. Copy the data up to the "etList" section and paste it into a new "data_package.xml" file.
    c. Use a bash script (see the attached "create_etList.sh" file) to read the Cisco_Dragon_Event_Numbers.txt file and export the data into a properly formatted "etList" section. Copy the etList section into the new "data_package.xml" file.
    d. Use another bash script (cannot attach it at this time) to read the Dragon6_Events_Stripped.txt file and export the data into a properly formatted "det id" section. Copy the new "det id" section into the new "data_package.xml" file.
    e. Finally, copy the lines after the "det id" section of the original data_package.xml file into the new XML file.
NOTE: This process basically creates a new data_package.xml file containing approximately 4900 device events.
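Since I can't attach the scripts here, this is the general shape of what a create_etList.sh-style generator does. The `ET_PLACEHOLDER` element is purely a stand-in, because I can't reproduce the MARS XML schema from memory; copy the real open/close tags for an etList entry from the data_package.xml you exported in step 4:

```shell
#!/bin/sh
# Sketch of a create_etList.sh-style generator (placeholder tags!).
# Reads one event-number entry per line and wraps it in an XML element.
# Replace ET_PLACEHOLDER with the real element name from your step 4 export.
while IFS= read -r num; do
    [ -n "$num" ] || continue   # skip blank lines
    printf '  <ET_PLACEHOLDER>%s</ET_PLACEHOLDER>\n' "$num"
done < Cisco_Dragon_Event_Numbers.txt > etList_section.txt
```

The det id script in step 7d is the same idea, except it reads the file two lines at a time (Event ID, then Event Number) and substitutes both into the element copied from the export.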
8. Lastly, place the new XML file under a "dsf" directory and place it in a ZIP file. This becomes your new support package.
We successfully imported the ZIP file as a Device Support Package. The import took a while - we went home and the next morning it was successful.
Some items to note are:
1. Make sure there are NO duplicates in the etList section. This can be accomplished by importing the Cisco_Dragon_Event_Numbers.txt data into Excel and filtering out the duplicates.
2. Make sure all of the "det id" entries have a corresponding etList entry otherwise you'll get a DSF failure when trying to import the Device Support Package.
3. To check the validity of your XML format, load your XML file in Firefox. If there are any errors, Firefox will tell you which line contains the issue. IE did not correctly tell us where errors appeared.
4. I will attach the second bash script when I can get around to re-typing it. It is basically the same script as the one attached except it echoes the lines needed to format the "det id" section. It also contains a switch to process the Event ID text then the Event Number.
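Two of the notes above can also be scripted, in case Excel or Firefox isn't handy. These are my own additions: the awk dedup keeps the first occurrence of each number in its original order, and the Python one-liner is just a stdlib well-formedness check (`xmllint --noout data_package.xml` does the same job if libxml2 is installed):

```shell
# Note 1: drop duplicate event numbers, preserving original order
awk '!seen[$0]++' Cisco_Dragon_Event_Numbers.txt > etList_numbers_unique.txt

# Note 3: report the first XML well-formedness error with its line number
python3 -c '
import sys, xml.sax
try:
    xml.sax.parse(sys.argv[1], xml.sax.ContentHandler())
    print("well-formed")
except xml.sax.SAXParseException as e:
    print("error at line %d: %s" % (e.getLineNumber(), e.getMessage()))
    sys.exit(1)
' data_package.xml
```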
Good luck!
Dave Grannas
Senior Consultant
Intelesys Corp.

Similar Messages

  • SAP PS Extractors requirement for BI + how we can do RRI for pdf doc?

    SAP PS Extractors requirement for BI + how we can extract pdf document?
    Hi Experts,
    Could anyone please provide answers to the following:
    1) Which SAP PS extractors are currently available?
    2) How can we extract PDF documents from SAP PS to the SAP BI side, or is any other possibility available to bring them into reporting?
    3) Is it compulsory to use 0PSGUID or not?
    Please advise, experts...
    Regards
    selvan
    [email protected]
    Edited by: peras selvan on Mar 16, 2008 11:39 AM

    Hi Peras,
    Please check these links:
    http://help.sap.com/bp_biv335/BI_EN/html/BW/ProjSystem.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/b4/60073c0057f057e10000000a114084/frameset.htm
    hope this helps.
    Regards
    CSM Reddy

  • Settings required for including changes in data in a transport request.

    Hi All,
    I need the settings required for saving ths data changed in a database table into a customizing request.
    Kindly let me know the steps involved in the process.
    To elaborate:
    I have a maintenance screen for a DB table.
    Whenever a user creates/changes data to/from the DB table , using the maintenance screem, the data should be saved into a customizing TR.
    Regards-
    Harmeet Singh,

    Hi there,
    Any config change will be saved under a transport request TR. Also tables are of 2 types: Customizing table & application table.
    Changes / new entries in a customizing table are transportable. You will get a TR when the customizing table values are changed or a new entry is made.
    But application table entries / changes are not transportable. You will not get a TR when saving the data.
    To check, go to SE11 --> give the table name --> display --> attributes --> Delivery class. Delivery class A is an application table, C is a customizing table.
    Also some standard tables like MARA, VBAK are not allowed to be maintained. If a table is allowed to be maintained, "table maintenance allowed" is checked in the above.
    Regards,
    Sivanand

  • Requirements for swap partition

    Hi Gurus,
    You guys are rocking!!!
    I need an urgent reply since we are having a discussion on partitioning the tables.
    I have 2 schemas , PROCESSING and REPORTING.
    REPORTING schema has 3 year worth of rolling data. PROCESSING will have just the current month data. I have some 40 tables in REPORTING schema which I am planning to partition based on Months and Region as follows
    CREATE TABLE regional_sales
                ( deptno number, item_no varchar2(20),
                  txn_date date, txn_amount number, state varchar2(2))
       PARTITION BY RANGE (txn_date)
       SUBPARTITION BY LIST (state)
       SUBPARTITION TEMPLATE
          (SUBPARTITION west VALUES ('W') TABLESPACE tbs_1,
           SUBPARTITION east VALUES ('E') TABLESPACE tbs_1,
           SUBPARTITION north VALUES ('N') TABLESPACE tbs_1)
      (PARTITION p1 VALUES LESS THAN (trunc(TO_DATE('01-Feb-2007','DD-MON-YYYY'),'MM')),
       PARTITION p2 VALUES LESS THAN (trunc(TO_DATE('01-Mar-2007','DD-MON-YYYY'),'MM')),
       PARTITION p3 VALUES LESS THAN (trunc(TO_DATE('01-Sep-2008','DD-MON-YYYY'),'MM')))
    Now every day in PROCESSING schema, data gets inserted for the current month say september 2008. My requirement is I need to copy data from PROCESSING for this current month to the REPORT SCHEMA daily... Basically its like delete current months data in REPORT schema and copy the computed values for the current month from PROCESSING schema . This should happen daily night.
    Is there a way of doing this easily?
    -> One of my friends told to try SWAP partition. Can I implement SWAP partitions here in this scenario? I need the latest computed records for current month in PROCESSING schema, keep it and also copy it to REPORTING schema...meaning after copying both the schemas will have latest values for current month.
    -> what are the requirements for swap partition?
    -> Or is there an elegant way of copying ?
    Thanks guys.
    Saff

    Then I take my first posting back. Partition (subpartition to be precise) exchange is the best solution:
    SQL> CREATE TABLE regional_sales(
      2                              deptno number,
      3                              item_no varchar2(20),
      4                              txn_date date,
      5                              txn_amount number,
      6                              state varchar2(2)
      7                             )
      8    PARTITION BY RANGE (txn_date)
      9      SUBPARTITION BY LIST (state)
    10        SUBPARTITION TEMPLATE(
    11                              SUBPARTITION west VALUES ('W') TABLESPACE users,
    12                              SUBPARTITION east VALUES ('E') TABLESPACE users,
    13                              SUBPARTITION north VALUES ('N') TABLESPACE users
    14                             )
    15    (
    16     PARTITION p1 VALUES LESS THAN (TO_DATE('01-Aug-2008','DD-MON-YYYY')),
    17     PARTITION p2 VALUES LESS THAN (TO_DATE('01-Sep-2008','DD-MON-YYYY')),
    18     PARTITION p3 VALUES LESS THAN (TO_DATE('01-Oct-2008','DD-MON-YYYY'))
    19    )
    20  /
    Table created.
    SQL> CREATE TABLE regional_sales_processing(
      2                                         deptno number,
      3                                         item_no varchar2(20),
      4                                         txn_date date,
      5                                         txn_amount number,
      6                                         state varchar2(2)
      7                                        )
      8    PARTITION BY LIST (state)
      9    (
    10     PARTITION west VALUES ('W') TABLESPACE example,
    11     PARTITION east VALUES ('E') TABLESPACE example,
    12     PARTITION north VALUES ('N') TABLESPACE example
    13    )
    14  /
    Table created.
    SQL> CREATE TABLE regional_sales_processing_reg(
      2                                             deptno number,
      3                                             item_no varchar2(20),
      4                                             txn_date date,
      5                                             txn_amount number,
      6                                             state varchar2(2)
      7                                            )
      8    TABLESPACE example
      9  /
    Table created.
    SQL>
    SQL> -- Daily Processing Day 1
    SQL>
    SQL> TRUNCATE TABLE regional_sales_processing
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           1,
      5           1,
      6           sysdate-1,
      7           100,
      8           'E'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           1,
      5           1,
      6           sysdate-1,
      7           100,
      8           'W'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           1,
      5           1,
      6           sysdate-1,
      7           100,
      8           'N'
      9          )
    10  /
    1 row created.
    SQL> COMMIT
      2  /
    Commit complete.
    SQL> SELECT  *
      2    FROM regional_sales
      3  /
    no rows selected
    SQL> TRUNCATE TABLE regional_sales_processing_reg
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing_reg
      3    SELECT  *
      4      FROM  regional_sales_processing
      5      WHERE state = 'E'
      6  /
    1 row created.
    SQL> ALTER TABLE regional_sales
      2    EXCHANGE SUBPARTITION p3_east
      3      WITH TABLE regional_sales_processing_reg
      4  /
    Table altered.
    SQL> TRUNCATE TABLE regional_sales_processing_reg
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing_reg
      3    SELECT  *
      4      FROM  regional_sales_processing
      5      WHERE state = 'W'
      6  /
    1 row created.
    SQL> ALTER TABLE regional_sales
      2    EXCHANGE SUBPARTITION p3_west
      3      WITH TABLE regional_sales_processing_reg
      4  /
    Table altered.
    SQL> TRUNCATE TABLE regional_sales_processing_reg
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing_reg
      3    SELECT  *
      4      FROM  regional_sales_processing
      5      WHERE state = 'N'
      6  /
    1 row created.
    SQL> ALTER TABLE regional_sales
      2    EXCHANGE SUBPARTITION p3_north
      3      WITH TABLE regional_sales_processing_reg
      4  /
    Table altered.
    SQL> SELECT  *
      2    FROM regional_sales
      3  /
        DEPTNO ITEM_NO              TXN_DATE  TXN_AMOUNT ST
             1 1                    10-SEP-08        100 W
             1 1                    10-SEP-08        100 E
             1 1                    10-SEP-08        100 N
    SQL>
    SQL>
    SQL> -- Daily Processing Day 2
    SQL>
    SQL> TRUNCATE TABLE regional_sales_processing
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           1,
      5           1,
      6           sysdate-1,
      7           100,
      8           'E'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           1,
      5           1,
      6           sysdate-1,
      7           100,
      8           'W'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           1,
      5           1,
      6           sysdate-1,
      7           100,
      8           'N'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           2,
      5           2,
      6           sysdate,
      7           200,
      8           'E'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           2,
      5           2,
      6           sysdate,
      7           200,
      8           'W'
      9          )
    10  /
    1 row created.
    SQL> INSERT
      2    INTO regional_sales_processing
      3    VALUES(
      4           2,
      5           2,
      6           sysdate,
      7           200,
      8           'N'
      9          )
    10  /
    1 row created.
    SQL> COMMIT
      2  /
    Commit complete.
    SQL> SELECT  *
      2    FROM regional_sales
      3  /
        DEPTNO ITEM_NO              TXN_DATE  TXN_AMOUNT ST
             1 1                    10-SEP-08        100 W
             1 1                    10-SEP-08        100 E
             1 1                    10-SEP-08        100 N
    SQL> TRUNCATE TABLE regional_sales_processing_reg
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing_reg
      3    SELECT  *
      4      FROM  regional_sales_processing
      5      WHERE state = 'E'
      6  /
    2 rows created.
    SQL> ALTER TABLE regional_sales
      2    EXCHANGE SUBPARTITION p3_east
      3      WITH TABLE regional_sales_processing_reg
      4      WITH VALIDATION
      5  /
    Table altered.
    SQL> TRUNCATE TABLE regional_sales_processing_reg
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing_reg
      3    SELECT  *
      4      FROM  regional_sales_processing
      5      WHERE state = 'W'
      6  /
    2 rows created.
    SQL> ALTER TABLE regional_sales
      2    EXCHANGE SUBPARTITION p3_west
      3      WITH TABLE regional_sales_processing_reg
      4      WITH VALIDATION
      5  /
    Table altered.
    SQL> TRUNCATE TABLE regional_sales_processing_reg
      2  /
    Table truncated.
    SQL> INSERT
      2    INTO regional_sales_processing_reg
      3    SELECT  *
      4      FROM  regional_sales_processing
      5      WHERE state = 'N'
      6  /
    2 rows created.
    SQL> ALTER TABLE regional_sales
      2    EXCHANGE SUBPARTITION p3_north
      3      WITH TABLE regional_sales_processing_reg
      4      WITH VALIDATION
      5  /
    Table altered.
    SQL> SELECT  *
      2    FROM regional_sales
      3  /
        DEPTNO ITEM_NO              TXN_DATE  TXN_AMOUNT ST
             1 1                    10-SEP-08        100 W
             2 2                    11-SEP-08        200 W
             1 1                    10-SEP-08        100 E
             2 2                    11-SEP-08        200 E
             1 1                    10-SEP-08        100 N
             2 2                    11-SEP-08        200 N
    6 rows selected.
    SQL>
    Now, if in your case there are local indexes involved, you will need to add the INCLUDING INDEXES clause to the exchange statements. Any global indexes will become unusable and you will have to rebuild them.
    SY.

  • Minimum requirements for logic pro 7

    what are the minimum requirements for Logic 7? Can I get away with 10.2.8 or does it have to be higher?

    mr d, a lot has happened since you moved to mars......recommend you use some of that martian dracma you've been hoarding to buy in on tiger.
    logic just loves a big fat tiger to chow on.
    seriously, d rock, where you been, man?

  • Table for temporary stock / requirement for tcode /afs/md04

    Dear experts,
    I developed a Z-report to display STO number, production order number, operation etc.
    Mainly I use the AFPO, AFRU, MSEG, MCHB & J_3ABDSI tables.
    My problem is, when I compare with tcode /afs/md04, tab "temporary stock / requirement":
    for some MATNR
    the data shows properly,
    and some MATNR are blank with the message "Last MRP run on 04.04.2011" or such a date.
    How can I filter in the Z-report which MATNR are not in tcode /afs/md04, tab "temporary stock / requirement"?
    my code is.
    SELECT  j_3abdsi~aufnr j_3abdsi~matnr j_3abdsi~j_4krcat j_3abdsi~mbdat j_3abdsi~menge INTO TABLE it_eket FROM j_3abdsi
        FOR ALL ENTRIES IN it_final1
        WHERE
              j_3abdsi~j_4krcat = it_final1-j_4ksca AND
              j_3abdsi~matnr = it_final1-matnr AND
              j_3abdsi~werks = it_final1-werks AND
              j_3abdsi~bdart = 'TB' AND
              j_3abdsi~plart = 'B' AND
              j_3abdsi~bsart = 'UB'.
    Pls help .
    Rayhan
    Edited by: Abu Rayhan on Apr 5, 2011 10:24 AM

    CLEAR i_data1.
      REFRESH i_data1.
      LOOP AT i_mara.
        READ TABLE i_marc WITH KEY matnr = i_mara-matnr  BINARY SEARCH .
        IF sy-subrc = 0 .
          CALL FUNCTION 'J_3AM_DISPOSITION_DISPL'
            EXPORTING
              i_matnr                 = i_mara-matnr
              i_werks                 = p_werks
          I_DIALOG                = ' '
          I_SPERR                 = ' '
          I_AUFRUF                = ' '
          I_BANER                 = ' '
             i_todate                = todate
          I_HEADER_ONLY           = ' '
           IMPORTING
             ex_dbba                 = i_data3
          E_MDKP                  =
          EX_PBBD                 =
          EX_MELD                 =
          E_CM61M                 =
           EXCEPTIONS
             material_gesperrt       = 1
             wbz_fehler              = 2
             material_prgr           = 3
             dispo_gesperrt          = 4
          OTHERS                  = 5.
          IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
          ELSE.
            IF i_data3[] IS NOT INITIAL.
              LOOP AT i_data3 INTO i_data4 .
                  IF ( i_data4-j_3astat ='A' OR i_data4-j_3astat ='T') AND i_data4-j_3abskz ='C'   .
                    READ TABLE i_t001l WITH KEY lgort = i_data4-lgonr  BINARY SEARCH .
                    IF sy-subrc = 0 .
                      CLEAR i_data1str.
                      i_data1str-matnr = i_data4-matnr.
                      i_data1str-j_3asize = i_data4-j_3asize .
                      i_data1str-lgort = i_data4-lgonr.
                      i_data1str-menge = i_data4-menge .
                      COLLECT i_data1str INTO i_data1.
                    ENDIF.
                  ENDIF.
              ENDLOOP.
            ENDIF.
          ENDIF.
        ENDIF.
      ENDLOOP.
    Questions:
    The i_mara recordset has 500 materials.
    It takes more than 3 hours to finish this report.
    What should be changed?
    Can you help me?
    Thanks.

  • What are the settings required for QM in procurement

    Hi Team,
    What are the settings required for QM in procurement? I have set the indicator for QM in procurement in the QM view in the material master.
    I am not clear about the following fields to be maintained in the QM view.
    QM Control Key
    Certificate type
    Target QM system
    Tech. delivery terms Indicator.
    Please suggest in which cases these fields are used. Are they relevant to quality certificates?
    Thanks

    Hi,
    All meaning are
    QM Control Key :
    If you activate the indicator for QM in procurement in the material master record at the client level, you must also store a control key at the plant level for quality management in procurement.
    Certificate type :
    Certificate types apply to certificate processing in procurement and certificate creation.
    Target QM system :
    whether the vendor's verified QM system, according to vendor master record or quality info-record (for a combination of vendor/material) meets the requirements for QM systems as specified in the material master
    -  If you activate the indicator for QM in procurement in the material master record at the client level, you must also store a control key at the plant level for quality management in procurement. If you want procurement control, define the control key accordingly.
    -  If you want the vendor's particular certificate for a material, then you have to define a certificate type.
    Also, you have to maintain Material, Vendor's Info record at plant level.
    Thanks,
    JM

  • List of Manual Setup required for iSetup to work

    Hi All,
    This is Mugunthan from iSetup development. Based on my interaction with customers and Oracle functional experts, I had documented list of manual setups that are required for smooth loading of selection sets. I am sharing the same. Please let me know if I anyone had to enter some manual setup while using iSetup.
    Understanding iSetup
    iSetup is a tool to migrate and report on your configuration data. Various engineering teams from Oracle develop the APIs/programs that migrate the data across EBS instances. Hence all your data is validated for all business cases and data consistency is guaranteed. It requires a good amount of functional setup knowledge and a bit of technical knowledge to use this tool.
    Prerequisite setup for Instance Mapping to work
    ·     ATG patch set level should be same across all EBS instances.
    ·     Copy DBC files of each other EBS instances participating in migration under $FND_SECURE directory (refer note below for details).
    ·     Edit sqlnet.ora to allow connections between DB instances (tcp.invited_nodes=(<source>,<central>))
    ·     Make sure that same user name with iSetup responsibility exists in all EBS instances participating in migration.
    Note:- iSetup tool is capable of connecting to multiple EBS instances. To do so, it uses dbc file information available under $FND_SECURE directory. Let us consider three instances A, B & C, where A is central instance, B is source instance and C is target instances. After copying the dbc file on all nodes, $FND_SECURE directory would look like this on each machine.
    A => A.dbc, B.dbc, C.dbc
    B => A.dbc, B.dbc
    C => A.dbc, C.dbc
    Prerequisite for registering Interface and creating Custom Selection Set
    The iSetup super role is mandatory to register and create custom selection sets. It is not sufficient to register the API on the central/source instance alone. You must register the API on all instances participating in migration/reporting.
    Understanding how to access/share extracts across instances
    Sharing iSetup artifacts
    ·     Only the exact same user can access extracts, transforms, or reports across different instances.
    ·     The “Download” capability offers a way to share extracts, transforms, and loads.
    Implications for Extract/Load Management
    ·     Option 1: Same owner across all instances
    ·     Option 2: Same owner in Dev, Test, UAT, etc – but not Production
    o     Extract/Load operations in non-Production instances
    o     Once thoroughly tested and ready to load into Production, download to desktop and upload into Production
    ·     Option 3: Download and upload into each instance
    Security Considerations
    ·     iSetup does not use SSH to connect between instances. It uses the Concurrent Manager framework to launch concurrent programs on the source and target instances.
    ·     iSetup does not write passwords to any files or tables.
    ·     It uses JDBC connectivity obtained through the standard AOL security layer.
    Common Incorrect Setups
    ·     Failure to complete/verify all of the steps in “Mapping instances”
    ·     The DBC file must be copied again if the EBS instance has been refreshed or autoconfig has been run.
    ·     Custom interfaces must be registered in all EBS instances. Registering them on the central/source instance alone is not sufficient.
    ·     The standard Concurrent Manager must be up in order to pick up iSetup concurrent requests.
    ·     The iSetup Financials and SCM modules are supported from 12.0.4 onwards.
    ·     iSetup is not certified on RAC. However, you may still work with iSetup if you copy the DBC file to all nodes with the same name as was registered through the Instance Mapping screen.
    Installed Languages
    iSetup cannot Load or Report if the number and type of installed languages or the DB character set differ between the Central, Source and Target instances. If this is your case, there is a workaround: download the extract zip file to your desktop and unzip it, edit AZ_Prevalidator_1.xml to match your target instance languages and DB character set, then zip it back up and upload it to the iSetup repository. You will then be able to load to the target instance. You must ensure that this does not corrupt data in the DB. This is considered a customization, and any data issue arising from this modification is not supported.
    Custom Applications
    Application data is a prerequisite for most of the Application Object Library setups, such as Menus, Responsibilities, and Concurrent Programs. iSetup does not currently migrate custom applications. So, if you have created any custom application on the source instance, please create it manually on the target instance before moving Application Object Library (AOL) data.
    General Foundation Selection Set
    Setup objects in the General Foundation selection set support filtering, i.e. the ability to extract specific setups. Since most AOL setup data such as Menus, Responsibilities and Request Groups is shipped by Oracle itself, it does not make sense to migrate all of it, as it is already available on the target instance. Hence, it is strongly recommended to extract only those setup objects that you have edited or added. This also improves performance. iSetup uses FNDLOAD (the seed data loader) to migrate most of the AOL setups. The default behavior of FNDLOAD is given below.
    Case 1 – Shipped by Oracle (Seed Data)
    FNDLOAD checks the last_update_date and last_updated_by columns before updating a record. If the record was shipped by Oracle, the default owner of the record is Oracle, and FNDLOAD skips records that are identical, so it does not change the last_updated_by or last_update_date columns.
    Case 2 – Shipped by Oracle and customized by you
    If a record was customized in the source instance, FNDLOAD updates the record based on the last_update_date column. If the last_update_date in the target is more recent, FNDLOAD does not update the record, so it does not change the last_updated_by column. Otherwise, it updates the record with the user who customized it in the source instance.
    Case 3 – Created and maintained by customers
    If a record was newly added or edited in the source instance by you, FNDLOAD updates the record based on the last_update_date column. If the last_update_date of the record in the target is more recent, FNDLOAD does not update the record, so it does not change the last_updated_by column. Otherwise, it updates the record with the user who customized it in the source instance.
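    The three cases above reduce to one rule: untouched Oracle seed rows are skipped, and customized or customer-created rows win only when the source copy is newer. A minimal sketch of that decision follows; it only models the behavior described here, it is not FNDLOAD itself, and the record/field names are illustrative:

    ```python
    from datetime import datetime

    ORACLE_SEED_OWNER = "ORACLE"  # illustrative label for rows shipped by Oracle

    def fndload_should_update(source_rec, target_rec):
        """Model of the FNDLOAD upload decision described above (illustrative).

        Each record is a dict with 'last_updated_by' and 'last_update_date'.
        Returns True if the target record would be overwritten.
        """
        # Case 1: both rows are untouched Oracle seed data -> identical rows are skipped
        if (source_rec["last_updated_by"] == ORACLE_SEED_OWNER
                and target_rec["last_updated_by"] == ORACLE_SEED_OWNER):
            return False
        # Cases 2 and 3: the source row wins only if it is more recent than the target
        return source_rec["last_update_date"] > target_rec["last_update_date"]
    ```
    
    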
    Profiles
    HR: Business Group => Set to the name of the Business Group for which you would like to extract data from the source instance. After loading the Business Group onto the target instance, make sure that this profile option is set appropriately.
    HR: Security Profile => Set to the name of the Business Group for which you would like to extract data from the source instance. After loading the Business Group onto the target instance, make sure that this profile option is set appropriately.
    MO: Operating Unit => Set to the name of the Operating Unit for which you would like to extract data from the source instance. After loading the Operating Unit onto the target instance, make sure that this profile option is set, if required.
    Navigation path to do the above setup:
    System Administrator -> Profile -> System.
    Query for the above profiles and set the values accordingly.
    Descriptive & Key Flex Fields
    You must compile and freeze the flexfield values before extracting with iSetup.
    Otherwise, it would result in a partial migration of data. Please verify that all of the data has been extracted by reporting on your extract before loading, to ensure data consistency.
    You can load KFF/DFF data to the target instance even when the structures in the source and target instances are different, but only in the cases below.
    Case 1:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1, Loc2, Loc3 (Mandate), Loc4, Loc5 and Loc6
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4, then the locations will be loaded to the target instance without any issue. If you do not provide a value for Loc3, the API will fail, as Loc3 is a mandatory field on the target.
    Case 2:
    Source => Loc1 (Mandate), Loc2 (Mandate), Loc3, and Loc4
    Target=> Loc1 (Mandate), Loc2
    If you provide values for Loc1 (Mandate), Loc2 (Mandate), Loc3 and Loc4 and load the data to the target instance, the API will fail, as Loc3 and Loc4 do not exist in the target instance.
    It is always recommended that the KFF/DFF structure be the same in both the source and target instances.
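    The two cases above can be expressed as a simple pre-load check. This is a hypothetical helper for illustration, not an iSetup API; the check_segments name and the Loc* segment names are made up:

    ```python
    def check_segments(provided, target_segments):
        """Pre-load check for KFF/DFF segment mismatches (illustrative only).

        provided        -- dict of segment name -> value from the source extract
        target_segments -- dict of segment name -> True if mandatory on the target

        Returns a list of problems; an empty list means the load should succeed.
        """
        problems = []
        # Case 2: a provided segment does not exist on the target at all
        for seg in provided:
            if seg not in target_segments:
                problems.append(f"{seg} not defined on target")
        # Case 1: a segment that is mandatory on the target was left empty
        for seg, mandatory in target_segments.items():
            if mandatory and not provided.get(seg):
                problems.append(f"mandatory {seg} has no value")
        return problems
    ```
    
    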
    Concurrent Programs and Request Groups
    The Concurrent Program API migrates the program definition (definition + parameters + executable) only. It does not migrate the physical executable files under APPL_TOP; please use a custom solution to migrate executable files. Load Concurrent Programs prior to loading Request Groups; otherwise, the associated concurrent program metadata will not be moved, even though the Request Group extract contains the associated Concurrent Program definition.
    Locations - Geographies
    If you have any custom Geographies, iSetup does not have an API to migrate this setup. Enter them manually before loading the Locations API.
    Currency Types
    iSetup does not have an API to migrate Currency Types. Enter them manually on the target instance after loading the Currency API.
    GL Fiscal Super User --> Setup --> Currencies --> Rates --> Types
    Associating Employee Details with a User
    The extract process does not capture the employee details associated with users. So, after loading the employee data successfully on the target instance, you have to configure the associations again on the target instance.
    Accounting Setup
    Make sure that all Accounting Setups that you wish to migrate have the status “Complete”. In-progress or incomplete Accounting Setups will not be migrated successfully.
    Note: Currently iSetup does not migrate Subledger Accounting (SLA) methods. Oracle ships some default SLA methods, such as Standard Accrual and Standard Cash, which you may use. If you want to use your own SLA method, you need to create it manually on the target instances, because iSetup does not have an API to migrate SLA. If a Primary Ledger is associated with Secondary Ledgers using a different Chart of Accounts, then the mapping rules should be defined manually in the target instance. The mapping rule name should match the XML tag “SlCoaMappingName”. After that, you will be able to load the Accounting Setup to the target instance.
    Organization API - Product Foundation Selection Set
    All organizations defined in the HR module will be extracted by this API. This API will not extract Inventory Organizations or Business Groups. To migrate Inventory Organizations, use the Inventory Organization API under the Discrete Mfg. and Distribution selection set. To extract Business Groups, use the Business Group API.
    Inventory Organization API - Discrete Mfg & Distribution Selection Set
    The Inventory Organization API extracts Inventory Organization information only. You should use the Inventory Parameters API to move parameters such as accounting information. The Inventory Organization API supports update, which means that you can update existing header-level attributes of an Inventory Organization on the target instance. The Inventory Parameters API does not support update; to update inventory parameters, use the Inventory Parameters Update API.
    We have a known issue where the Inventory Organization API migrates non-process-enabled organizations only. If your inventory organization is process enabled, you can migrate it with a simple workaround. Download the extract zip file to your desktop and unzip it. Navigate to the Organization XML and edit the XML tag <ProcessEnabledFlag>Y</ProcessEnabledFlag> to <ProcessEnabledFlag>N</ProcessEnabledFlag>. Zip the extract back up and upload it to the target instance. You can load the extract now. After successful completion of the load, you can manually enable the flag through the Forms UI. We are working on this issue and will update you once a patch is released to Metalink.
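    The unzip/edit/re-zip workaround can be scripted. A hedged sketch, assuming the extract is a plain zip whose XML files contain the <ProcessEnabledFlag> tag; the file and function names are illustrative, not part of iSetup:

    ```python
    import shutil
    import zipfile
    from pathlib import Path

    def flip_process_enabled(extract_zip, workdir="extract_tmp"):
        """Unzip an iSetup extract, set <ProcessEnabledFlag> to N in every XML
        file, and repack it. Illustrative workaround script, not an iSetup tool."""
        work = Path(workdir)
        with zipfile.ZipFile(extract_zip) as zf:
            zf.extractall(work)
        for xml_file in work.rglob("*.xml"):
            text = xml_file.read_text(encoding="utf-8")
            xml_file.write_text(
                text.replace("<ProcessEnabledFlag>Y</ProcessEnabledFlag>",
                             "<ProcessEnabledFlag>N</ProcessEnabledFlag>"),
                encoding="utf-8")
        # Repack the edited tree; make_archive appends the .zip suffix itself
        return shutil.make_archive(Path(extract_zip).stem + "_fixed", "zip", work)
    ```

    Remember that after a successful load you still re-enable the flag manually through the Forms UI, as described above.
    
    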
    Freight Carriers API - Product Foundation Selection Set
    The Freight Carriers API in the Product Foundation selection set requires Inventory Organization and Organization Parameters as prerequisite setups. These two APIs are available under the Discrete Mfg. and Distribution selection set. The Freight Carriers API is also available under the Discrete Mfg. and Distribution selection set under the names Carriers, Methods, Carrier-ModeServ and Carrier-Org, so use the Discrete Mfg. selection set to load Freight Carriers. In the next rollup release, the Freight Carriers API will be removed from the Product Foundation selection set.
    Organization Structure Selection Set
    It is highly recommended to set a filter and extract and load data related to one Business Group at a time. For example, setup objects such as Locations, Legal Entities, Operating Units, Organizations and Organization Structure Versions support filtering by Business Group. So, set the filter for a specific Business Group and then extract and load the data to the target instance.
    List of mandatory iSetup Fwk patches*
    8352532:R12.AZ.A - 1OFF:12.0.6: Ignore invalid Java identifier or Unicode identifier characters from the extracted data
    8424285:R12.AZ.A - 1OFF:12.0.6:Framework Support to validate records from details to master during load
    7608712:R12.AZ.A - 1OFF:12.0.4:ISETUP DOES NOT MIGRATE SYSTEM PROFILE VALUES
    List of mandatory API/functional patches*
    8441573:R12.FND.A - 1OFF:12.0.4: FNDLOAD DOWNLOAD COMMAND IS INSERTING EXTRA SPACE AFTER A NEWLINE CHARACTER
    7413966:R12.PER.A - MIGRATION ISSUES
    8445446:R12.GL.A - Consolidated Patch for iSetup Fixes
    7502698:R12.GL.A - Not able to Load Accounting Setup API Data to target instance.
    Appendix
    How to read logs
    ·     Logs are very important to diagnose and troubleshoot iSetup issues. Logs contain both functional and technical errors.
    ·     To find the log, navigate to View Detail screens of Extracts/ Transforms/Loads/Standard/Comparison Reports and click on View Log button to view the log.
    ·     Generic Loader (FNDLOAD, the seed data loader) logs are not printed as part of the main log. To view the actual log, take the request_id specified in the concurrent log and search for it in the Forms Request Search window in the instance where the request was launched.
    ·     Functional errors are mainly due to
    o     Missing prerequisite data – you did not load one or more prerequisite APIs before loading the current API. For example, trying to load “Accounting Setup” without loading “Chart of Accounts” would result in this kind of error.
    o     Business validation failure – the setup is incorrect as per a business rule. For example, a start date cannot be later than an end date.
    o     API does not support Update Records – if there is a matching record in the target instance and the API does not support update, you would get this kind of error.
    o     You unselected Update Records while launching the load – if there is a matching record in the target instance and you did not select Update Records, you would get this kind of error.
    Example – business validation failure
    o     VONAME = Branches PLSQL; KEY = BANKNAME = 'AIBC'
    o     BRANCHNAME = 'AIBC'
    o     EXCEPTION = Please provide a unique combination of bank number, bank branch number, and country combination. The 020, 26042, KA combination already exists.
    Example – business validation failure
    o     Tokens: VONAME = Banks PLSQL
    o     BANKNAME = 'OLD_ROYAL BANK OF MY INDIA'
    o     EXCEPTION = End date cannot be earlier than the start date
    Example – missing prerequisite data.
    o     VONAME = Operating Unit; KEY = Name = 'CAN OU'
    o     Group Name = 'Setup Business Group'
    o     ; EXCEPTION = Message not found. Application: PER, Message Name: HR_ORG_SOB_NOT_FOUND (Set of books not found for ‘Setup Business Group’)
    Example – technical or fwk error
    o     OAException: System Error: Procedure at Step 40
    o     Cause: The procedure has created an error at Step 40.
    o     Action: Contact your system administrator quoting the procedure and Step 40.
    Example – technical or fwk error
    o     Number of installed languages on source and target does not match.
    Edited by: Mugunthan on Apr 24, 2009 2:45 PM

    Mugunthan
    Yes, we have applied 11i.AZ.H.2. I am still getting several errors that we are trying to resolve.
    One of them is
    ===========>>>
    Uploading snapshot to central instance failed, with 3 different messages
    Error: An invalid status '-1' was passed to fnd_concurrent.set_completion_status. The valid statuses are: 'NORMAL', 'WARNING', 'ERROR'FND     at oracle.apps.az.r12.util.XmlTransmorpher.<init>(XmlTransmorpher.java:301)
         at oracle.apps.az.r12.extractor.cpserver.APIExtractor.insertGenericSelectionSet(APIExtractor.java:231)
    please assist.
    regards
    girish

  • Questions on SETSPN syntax and what is required for MANUAL AD auth

    I'll preface this by stating that I don't need to do all the extra stuff for Vintela SSO, SSO to database, etc.  I just need to know precisely what is necessary to do to get AD authentication working.  I managed to get it working in XIr2 previously but it's been so long and I'm not 100% sure that everything I wound up doing was absolutely necessary that I wanted to sort it out for good as we look at going to XI 3.1 SP3.
    In the XI 3.1 SP3 admin guide, page 503, the SETSPN command which is
    used as part of the setup process to establish a service account to
    enable AD authentication is outlined as follows:
    SETSPN.exe -A <ServiceClass>/<DomainName> <Serviceaccount>
    The guide suggests that the <ServiceClass> can be anything you want to
    arbitrarily assign. If I choose something other than the
    suggested "BOBJCentralMS" value, is there anywhere else I have to
    specify this value to allow the service account to function properly?
    The guide suggests that the <DomainName> should be the domain name on
    which the service account exists however I've seen many posts online which seem to
    indicate this <DomainName> should actually be the FQDN of the server
    running the CMS service instead of the general domain name.
    Clarification there would be very helpful if anyone has some insight.

    The CMS account can have an SPN of spaghetti/meatballs; there are no requirements (except 2 characters on each side of the /, I believe). The SPN created should be the value entered in the CMC > Authentication > Windows AD.
    The account must run the SIA and therefore must have AD permissions. Now if you are using IIS or client tools you don't even need an SPN. The SPN is for Kerberos only, which is required for Java app servers.
    The Vintela SSO white paper in this forum's sticky post explains the roles of a service account.
    Regards,
    Tim

  • Is CAL required for SharePoint Foundation 2010?

    Is an extra CAL required for uploading/downloading docs using SharePoint Foundation 2010? We already have a licensed Windows Server 2008 R2. Ours is an intranet application which will be accessed by 400 intranet users.
    If a CAL is required, can I use only 1 CAL for my app server to upload/download docs on SP Foundation, provided all 400 client requests go via the app server to the SP server?

    If I want to use SharePoint Foundation 2013, I need a Windows Server 2008 R2 / 2012 license and a SQL Server license (optional). I'm confused about what kind of Windows Server license is required, and whether I also need some other license to use SharePoint Foundation.
    How can I provide external users access to my SharePoint environment? Is there any requirement for another license as well?

  • Requirements for Cisco Jabber Applications

    Hi,
    I have to set up a lab environment to demonstrate the Jabber technologies for our client. I am really new to the Jabber technologies; as per my understanding it is a softphone that we can install on a PC or mobile device, with features such as IM, presence and WebEx.
    For my demonstration I am looking to install the softphone on a Windows PC.
    My doubt is: what software combination is required for the Jabber technologies?
    1) CUCM + Presence + Jabber (what versions?)
    2) CUCM + Jabber (what versions?)
    We have 3 MCS servers in our lab running CUCM version 7.0. Do we need a later version to support this?
    I searched Google for a good document but no luck... Please point me to good documents for the integration and configuration of this requirement.
    Thanks in advance.
    Nithin louis.

    Hi Nitin,
    Maybe you can follow these links to get started:
    https://supportforums.cisco.com/docs/DOC-23292
    http://www.cisco.com/en/US/docs/voice_ip_comm/jabber/Android/8_6/JABA_BK_A940B90D_00_jabber-for-android-admin-guide_chapter_00.html
    http://www.cisco.com/en/US/partner/docs/voice_ip_comm/jabber/Android/8_6/b_Cisco_Mobile_for_Android.html
    http://www.cisco.com/en/US/docs/voice_ip_comm/jabber/Android/8_6/JABA_BK_A940B90D_00_jabber-for-android-admin-guide_chapter_01.html#CJAB_TK_D698EA8E_00
    http://www.cisco.com/en/US/docs/voice_ip_comm/jabber/Android/8_6/JABA_BK_A940B90D_00_jabber-for-android-admin-guide_chapter_010.pdf
    Hope this helps.
    Please rate helpful posts!!!
    regards,
    Abdul

  • On installation of Firefox 4 beta, it tells me that it won't run on my Mac. When I go to system requirements, it takes me to a page for Firefox3.6 requirements. Where can I find the requirements for the beta?

    The link to system requirements for Firefox 4 beta takes you to a page that gives you the requirements for Firefox 3.6 instead. I assume that the requirements for the beta are different as I can't seem to install the beta on my Macs. Where can I find the beta system requirements?

    I never expected Mozilla to neglect the users of Apple's PowerPC (PPC) computers, while still supporting much older operating environments such as Windows XP. So, I should just throw away a $6,000 system, which is still fast and functional, and contribute masses of heavy metals to land-fills because…?
    I'm hurt and confused after supporting Firefox for many years — installing and using it not only on my own computers but on most of my customers' computers. Now I have to tell my customers with PPC Macs they can't have a secure cross-platform browsing experience from Mozilla. They'll have to learn to use some other browser, or buy a new computer.
    If one doesn't buy a new computer every year, one doesn't deserve Firefox 4. This is sort of sounding more like a Microsoft™© doctrine. (But wait, I can still have Firefox 4 on a ''SERIOUSLY CRAPPY WINDBLOWS XP pile of rubbish?!'') Please open your eyes Mozilla. This just doesn't make sense.

  • I am trying to install iTunes on my PC, but I get this error: "There is a problem with this Windows Installer package. A DLL required for this install to complete could not be run. Contact your support personnel or package vendor." Help!

    I am trying to install iTunes on my PC (using Windows 8.1), but I get this error: "There is a problem with this Windows Installer package. A DLL required for this install to complete could not be run. Contact your support personnel or package vendor." The iTunes file (64-bit) I am trying to install, is named "iTunes64Setup.exe". What seems to be the problem? Help!

    Hey madnest,
    Thanks for the question. After reviewing your post, it sounds like you are having difficulty installing iTunes in Windows. I would recommend that you read this article; it may help you resolve or isolate the issue.
    Issues installing iTunes or QuickTime for Windows
    Thanks for using Apple Support Communities.
    Have a nice day,
    Mario

  • Open Purchase Orders not considered as requirement for MRP Run

    Hi ,
    We are facing an issue where open purchase orders do not appear in the stock/requirements list and are also not considered as a requirement by the MRP run against a reservation.
    As a result, for a reservation demand of 10 units we end up with open POs for 10 units plus an additional planned order for 10 units.
    Material Type : ERSA
    MRP Type : PD (or VB)
    Lot size : EX (HB if MRP Type is VB)
    Could you please throw some light on how to correct our MRP settings so that open POs are considered?
    Saravanan

    Can you check to see if there is a re-order point set up for this material? That could be causing the problem too.
