Data Maps & Custom Collectors

Hi Guys,
I am fairly new to the development side of Collectors for Sentinel, and this may be a total beginner's question, but I've tried the docs and feel I am getting nowhere with an answer, so hopefully you can help out.
I am constructing an event from within the Collector, and I have read that it's best practice to use maps where possible, so I am trying to do this. If I hard-code, for example, e.TargetUserID = <the parsed string that I have>, then that works, but I want to make use of Rec2Evt.map, as most of the data that I am populating at this point is listed in there.
What I have done is to add the following:
Code:
Collector.prototype.initialize = function() {
    this.MAPS.Rec2Evt = new DataMap(this.CONFIG.collDir + "/Rec2Evt.map");
    // ... rest of initialize unchanged ...
};
Then within parse() I have the following:
Code:
// Set a test value on the record, then convert it via the data map
rec.testIP = "123.4.5.6";
rec.convert(this, instance.MAPS.Rec2Evt);
instance.SEND_EVENT = true;
return true;
The Rec2Evt.map file contains the default list of Sentinel Event Fields, and I have appended a record for TargetIP: TargetIP,testIP
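For reference, the relevant end of my Rec2Evt.map looks roughly like this (the ellipsis stands for the default entries, which I have left untouched; as I understand it, the format is the Sentinel event field followed by the 'rec' attribute to populate it from):
Code:
...
TargetIP,testIP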
Have I missed any obvious steps? What I was expecting was that when the event is received and parsed in Sentinel, the TargetIP field would have the value 123.4.5.6, but when I look in either ESM or the Sentinel 7 web UI I don't see this field getting set; other fields which I set manually are being set correctly.
This is the first time I have tried to use the data maps, so I assume I am doing something wrong; any pointers you guys have would be great.
Thanks
alanforrest
alanforrest's Profile: http://forums.novell.com/member.php?userid=90508
View this thread: http://forums.novell.com/showthread.php?t=453791

Hi Alan,
I'm not quite sure what you mean by "3 or 4 attributes", but here are
some guidelines:
Part of the Collector development process is to make a best-effort
attempt to parse out semantically distinct fields from the input and map
them to the Sentinel schema in a normalized way. Sometimes this is easy
- there's an IP address that's the target of a connection, extract it
and map it to TargetIP (TargetIP should already exist in Rec2Evt.map and
you just need to list the 'rec' attribute into which you parsed that
target IP). Sometimes this requires a little more work, for example
timestamps and whatnot that need normalization. Sometimes this is really
tricky, and you can't find a nice match to a Sentinel schema field.
Let's break this down into the following categories:
1) Simple 1:1 matches, like the IP address example above
2) 1:N matches, where you need to subparse a bit. An example might be a path like C:\WINDOWS\system32\etc\hosts; this would map to TargetDataName = 'hosts', TargetDataContainer = '/windows/system32/etc', TargetDataNamespace = 'c' (note that since Windows is case-insensitive, everything has been lowercased and the path separators normalized; we provide some utility flags and methods for this in the latest SDK, which will be out soon). A combined code sketch for cases 2 through 5 follows this list.
3) Mapped matches: in this scenario, you might have a field that indicates severity using some arbitrary proprietary scale, and you need to map this to Sentinel's 0-5 Severity. In this case it's good to use a KeyMap: put all your possible input values in the LHC, and map them to Sentinel Severities in the RHC. Then you can use lookup() to look up your input and map it to the correct output, put that output in a 'rec' attribute, and list that attribute in Rec2Evt.map (in this example on the RHS, after 'Severity,').
4) No schema match, doesn't need to be correlated: an example here might be "session type", which is something that Windows provides but that we don't (yet) have a dedicated schema field for (although we are considering it). Let's say you want to record that information in the event, but you don't need to correlate on that value. In that case you can use the add2EI() method to add a JSON NVP to the ExtendedInformation field, something like 'LoginType: interactive'.
5) No schema match, need to correlate: This is the trickiest case,
where you can't find a place to put your data but you need it in a
separate field so you can correlate on it. For this scenario you can use
one of the many unallocated ReservedVarXX fields. What you need to do is
pick an unused field, add it to Rec2Evt.map, and map your data to it.
The trick is that you can't guarantee that some other Collector is not using that field for a different purpose, so you have to be a bit more careful when writing correlation rules etc. to filter for your data only.
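Putting cases 2 through 5 together, here is a rough sketch of what the Collector code might look like. rec.convert(), lookup(), and add2EI() are the methods named above, but the KeyMap constructor, the exact method signatures, and all of the attribute and file names here are illustrative assumptions, so check the SDK templates for the real forms:
Code:
// In initialize(): load the data map, plus a KeyMap for severity
// translation (the KeyMap constructor and 'sev.map' name are hypothetical)
this.MAPS.Rec2Evt = new DataMap(this.CONFIG.collDir + "/Rec2Evt.map");
this.MAPS.Sev2Sev = new KeyMap(this.CONFIG.collDir + "/sev.map");

// In parse():
// Case 2: subparse a path, lowercasing and normalizing the separators
var path = "C:\\WINDOWS\\system32\\etc\\hosts".toLowerCase();
var parts = path.split("\\");
rec.tdNamespace = parts[0].replace(":", "");           // 'c'
rec.tdName      = parts[parts.length - 1];             // 'hosts'
rec.tdContainer = "/" + parts.slice(1, -1).join("/");  // '/windows/system32/etc'

// Case 3: translate a proprietary severity via KeyMap lookup(), then
// list 'sev' after 'Severity,' in Rec2Evt.map
rec.sev = instance.MAPS.Sev2Sev.lookup(rec.vendorSeverity);

// Case 4: non-correlated extra data goes into ExtendedInformation
instance.add2EI("LoginType", "interactive");

// Case 5: correlated data with no schema home goes into a 'rec'
// attribute that is listed against an unused ReservedVarXX row
rec.sessionType = rec.parsedLoginType;

// Finally, apply the map and send the event
rec.convert(this, instance.MAPS.Rec2Evt);
instance.SEND_EVENT = true;
return true;
The matching Rec2Evt.map rows for cases 3 and 5 would then read along the lines of 'Severity,sev' and 'ReservedVar11,sessionType' (ReservedVar11 is just an example; use whichever unused field you pick).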
In other words, the only attributes you should ever be adding as new rows to Rec2Evt.map are the ReservedVar fields. BTW, the event schema is fully documented here: 'Sentinel Event Schema'
(http://www.novell.com/developer/plug...nt_schema.html),
but note that not all fields are present in all platforms.
DCorlette
DCorlette's Profile: http://forums.novell.com/member.php?userid=4437
View this thread: http://forums.novell.com/showthread.php?t=453791

Similar Messages

  • How to extract data from custom made Idoc that is not sent

    Hi experts,
    Could you please advise whether there is a way to extract data from a custom-made IDoc (it collects a lot of data from different SAP tables)? Please note that this IDoc is not sent, as the target system is not fully maintained.
    For now, we would like to verify what data is extracted.
    Any help would be appreciated!

    Hi,
    The lengths of the fields in each segment are given in the EDSAPPL table. How you have to map them is explained in the example below.
    Suppose for segment1, EDSAPPL has 3 fields so below are entries
    SEGMENT          FIELDNAME           LENGTH
    SEGMENT1         FIELD1                   4
    SEGMENT1         FIELD2                   2
    SEGMENT1         FIELD3                   2
    Data in EDID4 would be as follows
    IDOC           SEGMENT                          APPLICATION DATA
    12345         SEGMENT1                        XYZ R Y
    When you are extracting data from these tables into your internal table, the mapping has to be as follows:
    FIELD1 = APPLICATIONDATA+0(4)        "reads the first 4 characters, which belong to FIELD1
    Similarly,
    FIELD2 = APPLICATIONDATA+4(2).
    FIELD3 = APPLICATIONDATA+6(2).
    FIELD1 would then hold 'XYZ', FIELD2 = 'R', FIELD3 = 'Y'.
    This holds in all cases, so all you need to do is identify which fields you want to extract and code as above to pull the data from this table.
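    To make the offset arithmetic concrete, here is the same extraction sketched in JavaScript, purely as an illustration (the field lengths 4, 2, 2 and the sample payload are taken from the example above):
    Code:
    // Fixed-length extraction from the EDID4 application data for SEGMENT1
    var applicationData = "XYZ R Y";
    var field1 = applicationData.substr(0, 4);  // 'XYZ ' -> FIELD1 = 'XYZ'
    var field2 = applicationData.substr(4, 2);  // 'R '   -> FIELD2 = 'R'
    var field3 = applicationData.substr(6, 2);  // 'Y'    -> FIELD3 = 'Y'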
    Hope this was helpful in explaining how to derive the data.

  • Inbound Idoc Data mapping sheet

    Hi experts,
    I need a data mapping sheet for inbound IDocs that will help me map the segments (header, item) for creating material, vendor, customer, etc.
    That sheet should help with mapping the tables and fields that are mandatory for those segments.
    Thanks & Regards.
    Ankur Garg.

    The process code you assigned to the inbound IDoc has to be assigned to a function module which processes this IDoc, right? Now, if you use a function module of your own (a custom FM!), you can do all the 'printing' you want.
    So there is no option for printing by default (standard); you will have to do it yourself.
    Another option would be to use a user exit. It seems there is a standard user exit (SIDOC001), or even a BAdI which you can use. I haven't used it myself, so I can't help you there.
    Edited by: Micky Oestreich on Mar 9, 2009 8:43 AM

  • [DME] Document about data mapping for HSBC ifile format (PP,ACH, COS)

    Dear SAP experts,
    I'm implementing an automatic payment module through HSBC in an SAP project.
    My customer wants to streamline their payments to vendors using automated payments with HSBC.
    I have a document with the ifile description, which shows the structure and data format of the HSBC ifile. I know how to create a DME format tree, but the HSBC format seems to be very complicated: it contains a lot of fields, and a lot of them I don't really understand.
    I'm sure that many of you have done this before. Do you have any document about data mapping for the HSBC file format? I have a lot of fields left undone, so any relevant documents would be very helpful.
    Hope you can help.
    Thanks in advance.
    Maxielight.

    Hello Maxielight,
    I too am working on this IFILE format for transmitting to HSBC for Indian INR payments, and am using DMEE.
    I'm struggling with a couple of fields in relation to PP.
    The 1st is the 'Record Count' in the file header, which is just a count of the total number of lines in the file.
    The 2nd is the 'Total number of instructions in batch' in the Batch Header section: I'm trying to count using aggregation via a reference node ID, but because the Batch Header is level 1 and I need to wait for '2nd Party Details for PP', which is at a lower level, I always get an error when I run the check ('aggregation not permitted', because the field I'm using is at a lower level or the nodes are not in the same segment).
    I've also tried creating a new segment with 'Delay output', but I still get a similar error about the nodes being in different segments.
    How did you get past this issue?
    We are not using COS; instead we will print our own cheques in the office, so I'm sorry I cannot offer advice there.
    Any advice you can offer would be appreciated.
    Thanks,
    Steve

  • AIM Documentation- Data Mapping for an Upgrade

    Hi All- I am tasked to make a data mapping document for a 10.7 - 12.2 upgrade.
    I have created CV.040s in the past for conversions, but would this template hold true for an upgrade mapping? I am only mapping Oracle base tables that have changed, not custom to base or anything like that...this is strictly an upgrade, not a reimplementation.
    Has anyone done this in the past (not 10.7, but 11-12 mappings)? Is this the correct AIM template I should be using (CV.040)? Does someone have a sample I could take a look at? Thanks in advance.

    Hi Alan,
    Are you using [you should use] object types to retrieve the employee details?
    Ideally you would create a SQL object in the database and a Java object that corresponds to the SQL object [using JPublisher]. This will simplify your persistence layer.
    Am I answering your question?
    Have you looked at the object type samples?
    http://otn.oracle.com/sample_code/tech/java/sqlj_jdbc/files/advanced/advanced.htm
    Object Java Sample
    and, for inheritance with object types,
    http://otn.oracle.com/sample_code/tech/java/sqlj_jdbc/files/9i_jdbc/content.html
    Object Type Inheritance Support Sample
    You can always use the Oracle9i JDBC driver to connect to your 8i DB.
    Elango.

  • Reference Data of Customer from R3 to CRM

    Hi Group,
    We downloaded ECC customers to CRM 5.0. Now we have a situation where we want to see the reference data of a customer in CRM: 1) Prev. acct no., 2) Buying Group.
    These fields are maintained in ECC in the "Company Code Data" of the Customer Master, under the "Account Management" tab.
    Can somebody help me figure out which fields in CRM these are mapped to, and where I can see them in the CRM "BP" transaction?
    Appreciate your time; help is awarded.

    Hi Anu,
    As Jai mentioned, it is not easy to bring additional customer master fields from R/3 into CRM during the download.
    This can be achieved by extending the middleware functionality. You also need to extend the CRM BP with these additional fields.
    First, identify the fields on the CRM BP that will receive the data from the R/3 fields. For this you may use some of the unused fields in the CRM BP, or extend the BP using EEWB to accommodate the new fields.
    Then you have to enhance the middleware by writing a custom function module to move the new field data into the bapimtcs structure, and then move the data from the bapimtcs structure into the appropriate CRM fields. You also need to take precautions for delta changes on these fields.
    This process involves a lot of development effort, and you should be able to get some code samples from this forum's previous threads.
    Here is the note which exactly addresses your requirement: 863611
    Reward points if it helps
    Regards,
    Paul Kondaveeti
    Message was edited by:
            Paul Kondaveeti

  • Is CS6 capable of 'Data Mapping' from an excel based database?

    I have CS6 and want to know if it is capable of data mapping from an Excel-based database.
    Basically I am using InDesign to create new organisational charts for my company. I have 23 separate documents created, all of which contain 1 image box and 2 text boxes per employee in that particular department.
    I want each InDesign document to point at the database and pull the correct details into each box within that document.
    I am aware of the 'place' feature, but all that does is let you select a range of cells and dump it into the document randomly.
    I have tried ringing Adobe and they pointed me here.
    I am a man on the edge with this... someone please help, even if it's to say 'NO', InDesign does not support this feature.
    Thanks in advance, Danny.

    Danity7 wrote:
    My company's view, much like my own (something which seems to be frowned upon in this forum, 'opinions/views'), is that when we have spent a considerable amount of money on the software, the answer to using certain features, or at least knowing whether the software is capable of a particular command, shouldn't come at an extra cost.
    Well, without more information on your data, the layout in ID, and how you expect to connect them, as Mike has already said, it's pretty hard to tell you what out of the box capabilities ID has that will work, if any, or where to point you to other commercial add-on products if they exist, or even to suggest a scripted solution, though my current reaction is that you are looking at some sort of XML workflow, which is well outside my own comfort zone.
    ID is a very complex program that has many more capabilities than what is "built-in" in the standard release versions. One of its greatest features is the ability to expand what you can do through scripts or plugins. This forum is full of free advice, but very few participants are willing to spend more than an hour designing a custom solution that will not be useful to anyone else unless they are compensated. The people who respond here are professionals, like you, who need to make a living. If I were paid for the time I spend here handing out free solutions and advice, I wouldn't need my regular design business anymore.
    If you feel a feature should be included in the general release at no additional cost, please ask Adobe to do so: Adobe - Feature Request/Bug Report Form
    You need to understand, though, that Adobe is a big corporation that budgets the time that developers can spend on new features, and most new things that are added are features that many users have asked for, and usually over the course of several versions (long document users are still waiting for improved footnotes, unchanged since they were introduced), so unless this is a "killer" new idea that will benefit thousands of users you are unlikely to see Adobe add it any time soon, if at all.

  • Is there any Data Mapping document between S&OP (the model that uses the supply planning operator) and SAP ECC that will help the client in the Data Mapping activity?

    Hello All,
    Is there any data mapping document between S&OP (the model that uses the supply planning operator) and SAP ECC that will help in the data mapping activity?
    Thanks,
    Mownesh

    There are standard templates in HCI data sources.
    E.g. 1) the Customer Master data template is SOP_MD_CustomerMaster, for extracting master data from SAP ECC and loading it into S&OP.
    KNVP is the table for the customer in ECC; from it you can select the fields as required:
    KUNNR for the customer number
    ADRNR for the address
    List of a few commonly used table names of ECC:
    Product Related:
    MARA – Material Master (MATNR)
    MARC – Material Master with Plant Data (MATNR, WERKS)
    MARD – Material Master with Storage Location Data (MATNR, LGORT, WERKS)
    MAKT – Material Master Material Descriptions (MATNR, MATKL)
    MBEW – Material Valuation Data (MATNR, BWTAR)
    MVKE – Material Master : Sales related Data
    MDKP, MDTB – MRP related Data( Header, Item)
    MCHA, MCHB – Material Batches (Header, Item) (MATNR, WERKS, LGORT, CHARG)
    Vendor/Supplier related:
    LFA1 – vendor data (LIFNR)
    LFB1 --  Company Code Segment : Vendor Data(LIFNR, BUKRS)
    LFC1 --  FI Related Vendor Data (LIFNR, BELNR)
    LFM1 – Pur. Orgn. Related Vendor Data (LIFNR, EKORG)
    PReq/PO, BOM Related:
    EBAN – Pur. Req. Data( BANFN, BNFPO, BADAT, MATNR)
    EINA – Purchase Info. Record(General Data)(INFNR, MATNR, LIFNR)
    EINE – Purchase Info. Record (pur. Orgn. Data) (INFNR, EKORG)
    ELBK, ELBN, ELBP – Vendor Evaluation Related Data
    EKKO – PO Data (Header) (EBELN, BSTYP, BSART)
    EKPO – PO Data (Item) (EBELN, EBELP, MATNR)
    Pur. Req., RFQ and PO are differentiated by Doc Type (BSTYP) in EKKO table.
    For RFQ it is ‘A’ and for PO it is ‘F’
    MKPF – GRN Data (Header) (EBELN, BLDAT, BUDAT, XBLNR, BKTXT)
    MSEG – GRN Data(Item) MBLNR, BWART, LIFNR, MATNR, EBELN)
    Apart from these there are a lot of tables which begin with ‘M’ & ‘E’, but we use the following very often.
    EQUK – Quota (Header)(QUNUM, MATNR)
    EQUP – Quota (Item) (QUNUM, QUPOS, LIFNR)
    EKBE – PO History Data (EBELN, EBELP, BELNR, BLDAT, MATNR, VGABE)
    EKBZ – PO History with Delivery Costs(EBELN, BELNR, LIFNR, XBLNR)
    EKET – Schedule lines data of a PO(EBELN, EINDT, SLFDT)
    EKES – Vendor Confirmations Data (EBELN, EBTYP, EINDT, XBLNR)
    T163F – Confirmation Texts (EBTYP, EBTXT)
    T156 – Movement Types (BWARE)
    T024 – Purchasing Groups
    T024E – Purchase Organizations
    T163 – Item Category’s in Purchasing Documents(PSTYP)
    T149D – Valuation Types
    T134 – Material Types
    FVLK – Delivery Types
    STKO, STPO – BOM(Bill Of Material) related Data (Header & Item)
    STPU, STPN, STST, STZU – BOM Related Tables
    RKPF, RBKP, RSEG (Header & Item) – MM – FI Related Data
    KONO, KONH – Pricing data
    T006 – Basic Unit Of Measurements
    Customer/Sales Order Related:
    VBAK : Sales Document(Header Data) (VBELN)
    VBAP : Sales Document(Item Data) (VBELN, POSNR, MATNR, ARKTX, CHARG)
    Enquiry, Quotation, and Sales Order are differentiated based on Doc. Type (VBTYP field) in the VBAK and VBAP tables: for Enquiry VBTYP = ‘A’, for Quotation ‘B’, and for Order ‘C’.
    LIKP : Delivery Table(Header Data) (VBELN, LFART, KUNNR, WADAT, INCOL)
    LIPS : Delivery Table(Item Data)(VBELN, POSNR, WERKS, LGORT, MATNR, VGBEL)
    (LIPS-VGBEL = VBAK-VBELN, LIPS-VGPOS = VBAP-POSNR)
    VTTK : Shipment Table(Header Data) (TKNUM)
    VTTP : Shipment Table (Item Data)(TKNUM, TPNUM, VBELN)
    (VTTP – VBELN = LIKP – VBELN)
    VBRK : Billing Table(Header Data) (VBELN, FKART, BELNF)
    VBRP : Billing Table(Item Data) (VBELN, POSNR, FKIMG, NEWR, VGBEL, VGPOS)
    (VBRP-AUBEL = VBAK-VBELN, VBRP-VGBEL = LIKP-VBELN)
    Apart from these there are a lot of other tables which start with ‘V’, but we use the following tables frequently.
    VBUK: All Sales Documents status & Admn. Data(Header) (VBELN, VBTYP)
    VBTYP = ‘C’ (Sales Order) VBTYP = ‘L’(Delivery) VBTYP = ‘M’(Invoice)
    VBUP: Sales Documents status & Admin. Data(Item) (VBELN, POSNR)
    VBEP : Sales Document Schedule Lines Data (VBELN, POSNR, EDATU, WMENG)
    VBKD: To get sales related Business data like Payment terms etc.(VBELN, ZTERM)
    VBFA: Sales Document flow data(VBELV, VBELN, POSNV, VBTYP)
    VBPA: Partner functions Data(VBELN, PARVW, KUNNR, LIFNR)
    TVLKT: Delivery Type: Texts(LFART, VTEXT)
    KNA1, KNB1, KNC1 : Customer Master Data and Other Partner’s Data(KUNNR,
    NAME1,LAND1)
    KNVK: Customer Master Contact Person(PARNR, KUNNR)
    KNVV: Customer Master Sales Data.
    LFA1, LFB1, LFC1: Vendor Master Data(To get Transporter data)(LIFNR, NAME1, ORT01)
    MARA, MARC, MARD : Material Master Data(Basic, Plant, St. Location Views)
    TVKO: Sales Organizations(VKORG)
    TVKOV: Distribution Channels(VTWEG)
    TVTA: Divisions(SPART)
    TVKBZ: Sales Office(VKBUR)
    TVBVK: Sales Group(VKGRP)
    T077D: Customer Account Group(KTOKD)
    T001W: Plants(WERKS)
    T001L: Storage Locations(LGORT)
    TWLAD: To get address of Storage Location and Plant(LGORT, ADRNR)
    TVAU: Sales Document (Order) Types
    KONV: Condition Types (pricing) (KNUMV, KSCHL, KWETR)
    T685T: Condition Types Texts.
    ADRC: To get Addresses of Partners
    VBBE, VBBS: Sales Requirements Data
    VBKA: Sales Activities Data
    VBPV: Sales Document Product Proposal
    Based on the functionality you can search ECC table names and fields
    Hope this information is helpful for you.
    Thanks and Regards,
    Anjali

  • Invoice Date needs to be date when Customer receives goods

    The client, when shipping by sea, would like the Invoice Date to be the day the customer receives the goods: basically the Pick Up Date, even though the delivery might have been 'shipped' from the jetty.
    How can this be done? Would the Invoice Date be the Pick Up Date?
    Sanjib

    Hi,
    I guess you're asking about OM shipping and AR invoices. If you have invoicing at line level, as soon as you ship a line OM inserts a line into the AR interface, and this line is ready to be imported into AR. If you ship from the USA to Europe by sea, it will take about 10-15 days for the goods to be delivered. As the invoice date is set when you import it into AR, to achieve your requirement you would not be able to import your invoice into AR, and would have no info in Receivables. Or do you want to import your invoice into AR as soon as the goods are shipped, and change the invoice date when the customer receives them?
    For the first option, you could do the following: define two AR batch sources, AIR and OTHER, for example. The first will be used for sea shipping and the second for ground or air shipping. In the first AR batch source, uncheck 'derive date'. In OM, define two order types, OT_AIR and OT_OTHER, one for sea shipping and the other for ground/air shipping. In order type OT_AIR, define AIR as the batch source, and in OT_OTHER define OTHER as the batch source.
    Normally, import your transactions with batch source OTHER into AR. For the orders shipped by sea, you will have to import them one by one: as soon as you receive the customer's acknowledgement, go to AR AutoInvoice, import the order received by the customer, and set as the default transaction date the date informed by the customer.
    I know it is not a great solution, so I think you will need some customization to have it done.
    Hope it helps,
    Ketter Ohnes

  • For a Sales order, what is Actual Delivery Date to Customer?

    Hi Experts,
    I am an ABAP consultant. In a particular report, for a Sales order I am asked to display:
    ‘Delivery Create Date (LIKP-ERDAT)’,
    ‘PGI Date’,
    ‘Requested Delivery Date (VBAK-VDATU)’ and
    ‘Actual Delivery Date to Customer’.
    And in addition, I also have to show:
    ‘No: of days between Order Create Date to Delivery Create Date’,
    ‘No: of days between Delivery Create Date to PGI Date’,
    ‘No: of days between Requested Delivery Date to Actual Ship Date’ and
    ‘No: of days between Order Create Date to Actual Ship Date’.
    I’ve searched SCN for similar questions but couldn’t get clarity. I’ll be very grateful if somebody can explain how to find the ‘Actual Delivery Date to Customer’ and what the difference is between this date and the ‘Billing Date’. Also, please explain the difference between ‘Delivery Create Date’ & ‘PGI’.

    Hi Rashmith,
    It seems the report is related to delivery. Below is the explanation for the different terms you mentioned in your question.
    Delivery Create Date (LIKP-ERDAT): when a delivery is created, with or without reference to an order, the system writes the date of creation, the time of creation, and the creator in the header data. Creation of a delivery does not mean that the goods are dispatched; there are many further steps between creating a delivery and dispatching the goods. For example, if I create an order and then create the delivery on 01.01.2014, the delivery create date will always be 01.01.2014.
    PGI Date (LIKP-WADAT_IST): the post goods issue date is the date on which the goods move out of the company to the carrier; it is the last step of delivery. When the delivery is created, the system derives various planning dates, including the planned PGI date (LIKP-WADAT); when the actual PGI happens, the system writes the date into LIKP-WADAT_IST, the actual goods issue date.
    Requested Delivery Date (VBAK-VDATU): when an order is created, the customer asks for a material, a quantity, and a date on which he wants the goods. This date is the requested delivery date. Based on it, the system checks the feasibility of delivering the goods on that date via delivery scheduling.
    You can get the RDD based on the following logic:
    Input the delivery number into VBFA-VBELN (together with VBFA-VBTYP_V, the preceding document category) and get VBFA-VBELV.
    Then input VBFA-VBELV into the VBAK table and get the value of VDATU.
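    Sketched in JavaScript over in-memory stand-ins for the two tables, purely to illustrate the lookup (real code would be ABAP SELECTs; VBTYP_V = 'C' is assumed here to mark a preceding sales order, per the VBTYP values listed elsewhere in this thread):
    Code:
    // Step 1: in VBFA (document flow), find the row whose follow-on document
    // (VBELN) is the delivery and whose preceding document is an order ('C');
    // VBELV is then the originating sales order number.
    // Step 2: read VDATU (requested delivery date) from that order in VBAK.
    function requestedDeliveryDate(deliveryNo, vbfa, vbak) {
        var flow = vbfa.filter(function(r) {
            return r.VBELN === deliveryNo && r.VBTYP_V === "C";
        })[0];
        if (!flow) return null;
        var order = vbak.filter(function(r) { return r.VBELN === flow.VBELV; })[0];
        return order ? order.VDATU : null;
    }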
    Actual Delivery Date to Customer: the actual delivery date is the actual post goods issue date (LIKP-WADAT_IST). This is the date on which the goods are issued to the customer, and from which the customer is liable for billing for the goods dispatched.
    Difference between the actual delivery date (actual goods issue date) and the billing date: normally, as per standard SAP, once the goods move out of the company the customer is liable for billing for the goods dispatched. So by default standard SAP copies the actual goods issue date (LIKP-WADAT_IST) into the billing date (VBRK-FKDAT), irrespective of the invoice creation date (if the delivery is goods-issued on 01.01.2014 and the billing document/invoice is created today, i.e. 14.04.2014, the system will by default take 01.01.2014 as the billing date). And this is the correct practice.
    However, if you want the current date as the invoice date instead of the actual goods issue date, you can control this with the copy-control features in SAP.
    So, depending on the copy-control settings, the billing date may differ from the actual goods issue date / actual delivery date.

  • Logical database PNP not retrieving data for custom infotypes.

    Hi all,
    I am using logical database PNP in a program. I have declared infotypes as follows:
    INFOTYPES: 0001, 0002, 0041, 9801, 9840.
    The problem is that the logical database is retrieving data for the standard infotypes but not for the custom ones. Any explanation as to why data for the custom infotypes is not being retrieved, and how this can be solved, will be greatly appreciated.
    regards,
    Hamza

    solved

  • Trigger badi after giving data in custom tab of po item

    Hi,
    We have added a custom tab at PO item level; now we want to make those fields mandatory for a specific document type. We are using the BAdI ME_PROCESS_PO_CUST.
    The message is triggered initially, but after entering data in the custom fields the BAdI is not triggered again; it does trigger if we make changes to some other fields in the PO.
    So, can we trigger the BAdI after entering data in the custom tab?
    I was trying something like this.
    METHOD if_ex_me_process_po_cust~process_item.
      DATA: l_if_header TYPE REF TO if_purchase_order_mm,
            ls_header   TYPE mepoheader,
            ls_mepoitem TYPE mepoitem.

      " Read the PO header and item data
      l_if_header = im_item->get_header( ).
      ls_header   = l_if_header->get_data( ).
      CALL METHOD im_item->get_data
        RECEIVING
          re_data = ls_mepoitem.

      " For document type BW, the custom operator-name field is mandatory
      IF ls_header-bsart = 'BW'.
        IF ls_mepoitem-zzoname IS INITIAL.
          MESSAGE 'Please enter the operator name in the custom data' TYPE 'E'.
        ENDIF.
      ENDIF.
    ENDMETHOD.
    Any pointers are welcome.

    Solved using the exit EXIT_SAPMM06E_012.
    This exit triggers even after entering data in the custom tab at PO item level.

  • Which table for the field "Net due date" of customer line items?

    Hi All,
    In which table could I find the field "Net due date" of customer line items?
    Thanks
    Gandalf

    I don't think there is a field for that. In the various SAP screens where you see this field, it is a calculated value (baseline date ZFBDT + days ZBD*T).
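    A quick worked sketch of that derivation in JavaScript, purely as an illustration (which of ZBD1T/ZBD2T/ZBD3T applies depends on the payment terms, so this is a simplification):
    Code:
    // Net due date is derived, not stored: baseline date (ZFBDT) plus days (ZBD*T)
    function netDueDate(baselineDate, days) {
        var due = new Date(baselineDate.getTime());
        due.setDate(due.getDate() + days);
        return due;
    }
    // e.g. baseline 15.01.2014 with 30-day terms -> due 14.02.2014
    var due = netDueDate(new Date(2014, 0, 15), 30);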

  • Function module to read data from customer tables

    Hi,
    Can anyone suggest a function module to read data from customer tables?
    Please reply.
    Thanks,
    Dharani.P

    Hi,
    this thread has your answer:
    Is there a Function module to get customer hierarchy data?
    Regards,
    Pravin

  • How to store flat file data in a custom table?

    Hi,
    I am working on an inbound interface. Can anyone tell me how to store flat file data in a custom table? What is the procedure?
    Regards,
    Sujan

    Hi,
    You can use the function module F4_FILENAME to pick the file from the front end, and then WS_UPLOAD to load it into an internal table:
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      CALL FUNCTION 'F4_FILENAME'   "Function to pick the file
        EXPORTING
          field_name = 'p_file'
        IMPORTING
          file_name  = p_file.

    START-OF-SELECTION.
      CALL FUNCTION 'WS_UPLOAD'
        EXPORTING
          filename = p_file
        TABLES
          data_tab = it_line.

    * Loop over it_line, splitting each line into the fields of your custom table
      LOOP AT it_line.
        SPLIT it_line AT ',' INTO itab-name
                                  itab-surname.
        APPEND itab.
      ENDLOOP.
    Then you can insert the values into your table from the itab work area.
    Regards,
    Isaac Prince

Maybe you are looking for

  • Strange FDF Symptoms

    I am having the same symptoms as others have written where my Classis ASP created FDF data won't populate a PDF form.  What I have is an enrollment process where Form A is used to send FDF data to an ASP program on the server which interacts with the

  • Cracked screen

    I have an HP 15 notebook and the screen is cracked. I am trying to see how I can connect it to an HP v75 external monitor. Please advise

  • Script for purge RAM in Mavericks

    In 10.7 Lion I made and script in Automator for apply purge command when I want to free up the RAM memory. I´ve been whatching that in Mavericks is necesary to put sudo before purge and then Terminal ask for the password. The question is: How can I m

  • Software Update does not work...

    When I open Software Update the window opens, and then the progression bar begins to fill... but then nothing. The bar fills about an inch. Yesterday I even left the house for about four or five hours, and when I got home it was the same. My last upd

  • Perfecting Hand Drawn Images

    I am trying to turn hand drawn b&w monograms into vector images.  My hand drawn images are not perfect and need to have lines straightened, curves fixed, etc.  How can I make these changes once I have performed the Live Trace function?  Is there a wa