Processing the batches of data in parallel

Hi friends,
Can anyone suggest how to process batches of data in parallel?
Please see the requirements below.
•     A new version of ZREDISND1_OLD, named ZREDISND1_EINVOICE, with the ability to process the blocked invoices using parallel processing. The program will process the blocked invoices and will have the same results / outcome as the old program ZREDISND1_OLD.
•     A new version of Z_RECTHI01_60, named ZRECTHI01_EINVOICE, with the ability to charge the amounts using parallel processing. The program will have the same results / outcome as the old program Z_RECTHI01_60.
•     An umbrella program ZREDISND1_UMBRELLA will have the same capability as ZREDISND1_EINVOICE but will split the data automatically into batches, initially by supplier and then further by Business Partner followed by Contract Account. It will therefore act as an umbrella program for ZREDISND1_EINVOICE.
•     An umbrella program ZRECTHI01_UMBRELLA will have the same capability as ZRECTHI01_EINVOICE but will split the data automatically into batches by supplier. It will therefore act as an umbrella program for ZRECTHI01_EINVOICE.
•     For the realisation of the umbrella programs a new table must be created. The table must contain all the service providers that are charged by E-Facturering. It must also store an indicator showing whether the blocked invoices belonging to a service provider are processed indirectly (the old method) or directly (the new method). The table name will be ZEINVOICE.
I have to develop the umbrella programs.
Please suggest how to implement this; it is urgent.
Thanks in advance.
Vijay
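A minimal sketch of the umbrella idea on a 7.40+ system, dispatching one batch per supplier via asynchronous RFC against a server group. Z_EINVOICE_PROCESS_BATCH is a hypothetical RFC-enabled wrapper around the selection/posting logic of ZREDISND1_EINVOICE, and the ZEINVOICE field names (LIFNR, DIRECT) are assumptions; the further split by Business Partner and Contract Account would follow the same pattern.

REPORT zredisnd1_umbrella.

DATA: gt_suppliers TYPE STANDARD TABLE OF zeinvoice, " ZEINVOICE as described above
      gv_open      TYPE i,                           " batches still running
      gv_task      TYPE c LENGTH 32.                 " task id per dispatched batch

START-OF-SELECTION.

  " Field names DIRECT and LIFNR are assumptions about the ZEINVOICE layout.
  SELECT * FROM zeinvoice INTO TABLE gt_suppliers
    WHERE direct = 'X'.                              " only suppliers on the new, direct method

  LOOP AT gt_suppliers INTO DATA(gs_supplier).
    gv_task = |BATCH{ sy-tabix }|.
    DO.
      " Hypothetical RFC-enabled wrapper around the logic of ZREDISND1_EINVOICE.
      CALL FUNCTION 'Z_EINVOICE_PROCESS_BATCH'
        STARTING NEW TASK gv_task
        DESTINATION IN GROUP DEFAULT
        PERFORMING batch_done ON END OF TASK
        EXPORTING
          iv_supplier           = gs_supplier-lifnr
        EXCEPTIONS
          communication_failure = 1
          system_failure        = 2
          resource_failure      = 3.
      CASE sy-subrc.
        WHEN 0.
          gv_open = gv_open + 1.
          EXIT.                                      " batch dispatched, next supplier
        WHEN 3.
          WAIT UP TO 1 SECONDS.                      " no free work process: wait, retry
        WHEN OTHERS.
          WRITE: / 'Dispatch failed for supplier', gs_supplier-lifnr.
          EXIT.
      ENDCASE.
    ENDDO.
  ENDLOOP.

  WAIT UNTIL gv_open = 0.                            " let all batches come back

FORM batch_done USING pv_taskname.
  " Callback per finished batch: fetch its result and free the slot.
  RECEIVE RESULTS FROM FUNCTION 'Z_EINVOICE_PROCESS_BATCH'
    EXCEPTIONS
      communication_failure = 1
      system_failure        = 2.
  gv_open = gv_open - 1.
ENDFORM.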

We have 8 "short-fat" tables to hold the source data uploaded from the source file via SQL*Loader (the SQL*Loader step is fast). The data consists simply of strings of characters, which we treat simply as VARCHAR2 for the most part.
These tables consist essentially of a case key (composite key initially) plus up to 250 data columns. 8*250 = 2000, so we can handle up to 2000 of these variable values. The source data may have any number of variable values in each record (up to that limit), but each record in a given file has the same structure. Each file-load event may have a different set of variables in different locations, so we have to map the short-fat columns COL001 etc to the corresponding variable definition (for validation etc) at runtime.
CASE_ID VARCHAR2(13)
COL001 VARCHAR2(10)
...
COL250 VARCHAR2(10)
We do a bit of initial validation in the short-fat tables, setting a surrogate key for each case etc (this is fast), then we pivot+validate this short-fat data column-by-column into a "long-thin" intermediate table, as this is the target format and we need to store the validation results anyway.
The intermediate table looks similar to this:
CASE_NUM_ID NUMBER(10) -- surrogate key to identify the parent case more easily
VARIABLE_ID NUMBER(10) -- PK of variable definition used for validation and in target table
VARIABLE_VALUE VARCHAR2(10) -- from COL001 etc
STATUS VARCHAR2(10) -- set during the pivot+validate process above
The target table looks very similar, but holds cumulative data for many weeks etc:
CASE_NUM_ID NUMBER(10) -- surrogate key to identify the parent case more easily
VARIABLE_ID NUMBER(10) -- PK of variable definition used for validation and in target table
VARIABLE_VALUE VARCHAR2(10)
We only ever load valid data into the target table.
Chris

Similar Messages

  • Where is the batch expiration date stored?

    I created a GR and received in materials of a certain batch, assigning an expiration date. When I go into MSC3N I can see the batch with the expiration date that I assigned; however, when I go into the batch table (MCHA) the field VFDAT (Shelf Life Expiration or Best-Before Date) is blank. Where is this data being stored? It must be somewhere, because it is being pulled into the program used to view batches.
    Thanks.

    Check in table MCH1.
    Whether the data is stored in MCH1 or MCHA depends on your batch level.
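    A small sketch of checking both tables, depending on the batch level (the material, batch and plant values are placeholders):

    DATA: lv_matnr TYPE matnr,
          lv_charg TYPE charg_d,
          lv_werks TYPE werks_d,
          lv_vfdat TYPE mch1-vfdat.

    " Batch level = material (or client): the expiration date sits in MCH1.
    SELECT SINGLE vfdat FROM mch1 INTO lv_vfdat
      WHERE matnr = lv_matnr
        AND charg = lv_charg.

    IF sy-subrc <> 0.
      " Batch level = plant: the expiration date sits in MCHA instead.
      SELECT SINGLE vfdat FROM mcha INTO lv_vfdat
        WHERE matnr = lv_matnr
          AND werks = lv_werks
          AND charg = lv_charg.
    ENDIF.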

  • Post processing the received serial data

    Hello,
    I want to process data received over the COM port in LabVIEW. The transmitter device sends 22 bytes of data in this format: 0x01 0x02 <18 bytes of information data, LSB first> 0x03 0x0D. I want to process the received 22 bytes as follows:
    1) Discard the first 2 bytes and the last 2 bytes.
    2) In the remaining 18 bytes, each consecutive set of 3 bytes should be reversed (because the transmitter sends the LSB first) and stored in a text file.
    Can anyone suggest the steps to implement this?
    Thanks.

    PatanGova wrote:
    When the first 2 bytes are skipped and the remaining 18 bytes are divided so that consecutive sets of 3 bytes are rearranged, it is also considering 030D0102 (rearranged as in spreadsheet 2 of the attached image), which it should not.
    It is ignoring the first two bytes. That's what the String Subset is doing. You are starting on the third byte and taking up to 18 bytes. If you put your Received Data on the output of the VISA Read, you would see that.
    PatanGova wrote:
    But I want to 1) skip the first two bytes (0x01 0x02) and the last two bytes (0x03 0x0D),
    2) and in the remaining 18 bytes reverse each consecutive set of 3 bytes (because the transmitter sends the LSB first).
    Your code is already doing that.
    PatanGova wrote:
    3) Each rearranged 3-byte value should be converted from 2's complement, and I want to plot the data continuously so that there are 6 continuous plots (each consisting of the 2's-complement values of the rearranged 3 bytes).
    Now that is a little more difficult. A three-byte two's complement will take a little work.
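    The three-byte two's complement itself is just integer arithmetic, whatever tool you wire it up in; a minimal ABAP-style sketch of the sign extension, assuming the bytes have already been reordered (lv_b0 = LSB, lv_b2 = MSB):

    " Byte values of one rearranged group (0..255 each).
    DATA: lv_b0    TYPE i,
          lv_b1    TYPE i,
          lv_b2    TYPE i,
          lv_value TYPE i.

    lv_value = lv_b0 + lv_b1 * 256 + lv_b2 * 65536.  " assemble the 24-bit word
    IF lv_value >= 8388608.                          " 2**23: sign bit is set
      lv_value = lv_value - 16777216.                " subtract 2**24 -> signed value
    ENDIF.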

  • I'm a new user. I'd like to acquire, save and process dynamic signal data

    My signal is 5-10 kHz and I'm using an NI 4551. I would like to acquire data for 5-10 seconds, save it to a file and then perform frequency spectrum analysis. I would really appreciate it if somebody could refer me to a VI that I can modify to perform this particular task.
    I also know there are good examples in LabVIEW 6.1, but I don't know which would be appropriate for my application, and how to integrate separate VIs to achieve my task.
    Thanks for the help

    In this case, you will not be able to use NIDSA.
    In order to program the 4451, you need to use the NIDAQ functions that are available in LabVIEW.
    I attach an example that shows how to save data acquired on a single channel to disk.
    Things you may want to add are :
    - Gain (input limits) depending on your sensor
    - AC/DC coupling
    I also compute a power spectrum on the whole waveform.
    You probably want to analyze your data in smaller blocks or analyze transients and modifying this example should be very easy.
    Please note I've been using the "waveform data type" to save data to a file in binary format.
    If you want your data to be "readable" by humans (i.e. saved in XLS file format), you will have to use the "Export waveform to spreadsheet file" VI available in the waveforms manipulation palette.
    I also would like to point you to the "Sound and Vibration 2.0 toolset" user manual:
    http://digital.ni.com/manuals.nsf/websearch/AC7BC19618C720BD86256BB2005AB842?OpenDocument&node=132100_US
    You may find useful information about signal processing in this manual.
    Hope this helps !
    Gerald
    Attachments:
    4451_Save_and_Process.vi 75 KB

  • Process order release date as batch manufacturing date

    Hi All,
    We have automatic batch creation during the release of the process order. Now the requirement is that the process order release date should be updated as the batch manufacturing date in the corresponding batch data (MSC3N). How can this be done?
    Regards
    Vinamrath

    Hi Vinamrath,
    I hope your process order release date and the batch creation date are the same.
    In that case, go to transaction CT04 (create characteristic) and create a characteristic, say "Batch manufacturing date". Then in the Addnl data tab maintain the table name as "MCH1 or MCHA" and the field name as "ERSDA" (created on - this is the batch creation date).
    If your process order release date and the batch creation date are not the same, there is a field called "HSDAT" (date of manufacture); you can maintain this field name in the Addnl data tab instead.
    Maintain this characteristic in the classification (CL02).
    Then it will get updated in the batch, and the same can be viewed in MSC3N.
    Regards
    Hari

  • How to process the data in a child node

    Hi gurus,
    I would like to know whether there is a way or a best practice to process child-node data. This is my case:
    The request input will have this structure:
    <To>
    <Email>[email protected]</Email>
    <Email>[email protected]</Email>
    </To>
    <From>David</From>
    From the above schema I will need to use the email addresses to invoke a DB process and get the user details. Any idea how I can get each of the nodes and process it? Sorry, I am still new to XML.
    My idea is to loop over the To element, but how? Thanks in advance.

    Something like this:
    <while name="While_1" condition="bpws:getVariableData('varIndex') < ora:countNodes('inputVariable','payload','/client:To/client:Email') + 1">
    <assign>
    - first copy rule
    varTemp = concat(bpws:getVariableData('varTemp','payload','/client:emailCollection')," ",bpws:getVariableData('inputVariable','payload','/client:To/client:Email[position() = bpws:getVariableData('varIndex')]'))
    -second copy rule
    bpws:getVariableData('varIndex') = bpws:getVariableData('varIndex') + 1
    </assign>
    </while>

  • No batch input data for screen SAPMSSY0 0120 error while running LSMW

    I created a batch input recording for a custom check register screen. I completed all the necessary steps, but when I process the batch input session my upload fails and gives me the following error:
    No batch input data for screen SAPMSSY0 0120
    I ran the batch input in the foreground to see what data I am missing and found that one of the screens requires a DOUBLE CLICK as user input. It seems that LSMW is not able to process the DOUBLE CLICK (even though the batch input recording captured it) to move forward with the transaction.
    Does anyone know how I can overcome this issue, or any other workaround?
    Thanks Nirankar

    Hi Chetan,
    List screens utilize SAPMSSY0/0120. Please check within your ABAP coding if you have any customer exit that tries to issue WRITE statements. If so, either remove them or add logic to not issue them in background (e.g. SY-BATCH = 'X'). Another culprit might be a SUBMIT statement in a user exit, which triggers a report with list output. I'm not completely sure, but I think the screen is also used for some value helps (do you have any coding forcing a value help popup?).
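    A minimal sketch of such a guard inside the exit coding (the WRITE is only a placeholder for whatever list output the exit produces):

    " Only produce list output in online dialog mode; in background (SY-BATCH)
    " or batch-input processing (SY-BINPT) it would force list screen SAPMSSY0 0120.
    IF sy-batch IS INITIAL AND sy-binpt IS INITIAL.
      WRITE: / 'Diagnostic output for dialog users only'.
    ENDIF.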
    Hope this helps, harald

  • Error ''No batch input data for screen SAPMF05A 0701'' with interface

    Hi,
    While running an interface from our legacy system (vendor invoices), this message appeared in AL11: "No batch input data for screen SAPMF05A 0701". I have seen "No batch input data for screen SAPMF05A 0700" in the past, but never 0701. I'm not able to find much documentation that helps me. Does anyone know what can cause this?

    Hi,
    You can check the reason for the error yourself in the following way, related to note 26050. Please check all 3 points and in particular follow the third one, which says:
    1. Call up transaction FBV0 and branch to the list.
    2. Select the parked document in which the termination occurs, and create a batch input session via 'Edit -> Create batch input'. Then process the batch input session in the foreground.
    By processing the batch input in the foreground, you will be able to find out the reason for the error message you are getting.
    So, please call transaction FBV0 > push the button "Document list" > execute, then choose the document you want to post and push the button "Create batch input". When the batch input session is created, go to transaction SM35 and run it in the foreground in order to find out the problem.
    Kind Regards,
    Fernando Evangelista

  • Option checkbox 'Dynpro standard size' in processing a batch

    Hi All,
    I have created a batch for creating sales orders using the BDC session method.
    While recording VA01 to ascertain the number of rows of the table control for the item-level data, we found that it is 2 in all resolutions.
    When I process the batch in the foreground with the option 'Dynpro standard size' checked, it runs fine.
    The functionality is built such that after an item is entered, the control passes to the 2nd line of the table control visible in the foreground. Hence each new item is always entered after the previous item.
    But when the user unchecks the 'Dynpro standard size' option and processes the batch, the table control has 5 lines.
    So the new item overwrites the second line item.
    Please suggest a workaround for this problem.
    Thanks in advance ,
    Sharat.

    Hi Sharat,
    In the BDC_INSERT function module there is an option for the default screen size (defsize),
    i.e. pass CTUPARAMS-DEFSIZE = 'X' to the FM and see if it helps.
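    A small sketch of that switch, assuming your release offers the CTUPARAMS parameter as described above; lt_bdcdata stands for your recorded BDCDATA table and the session is assumed to be open via BDC_OPEN_GROUP:

    DATA: ls_ctu     TYPE ctu_params,
          lt_bdcdata TYPE STANDARD TABLE OF bdcdata.

    ls_ctu-defsize = 'X'.              " force the standard (default) screen size

    " Session method: pass the options along with the recorded screens.
    CALL FUNCTION 'BDC_INSERT'
      EXPORTING
        tcode     = 'VA01'
        ctuparams = ls_ctu
      TABLES
        dynprotab = lt_bdcdata.

    " Call-transaction method: the same structure goes into OPTIONS FROM, e.g.
    " CALL TRANSACTION 'VA01' USING lt_bdcdata OPTIONS FROM ls_ctu.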
    Regards,
    Ravi

  • Controlling batch production date when using PI sheets

    Hello,
    The batch production date for some products is first entered at process order release. However, the GR is done using PI sheets. The GR entry via PI sheets takes the current date as the manufacturing date and overwrites the old batch production date in the batch.
    Does anyone out there know if there is a way to control this so that the old production date is not overwritten by the GR entry date?
    Many thanks.

    Dear,
    For this you need to use the setting "Batch classification in foreground" for the movement type. Make the setting in OMJJ.
    Come back after you have tried it.

  • 6534 DAQ card - I want to process a continuous stream of data through the DAQ card

    I have an NI 6534 DAQ card and a continuous stream of 16-bit parallel data. I use two ports as input ports (16 bits of parallel data), such that I receive 12000 bytes per second in my card buffer. I use DIO Read.vi, then on this bunch of 12000 bytes I do some processing and display an XY chart/waveform.
    The code structure is as follows:
    1) DIO Config.vi
    2) DIO Start.vi
    3) while loop
        {   DIO Read.vi (reads 12000 or a fixed number of bytes and inserts them into an array)
            Insert Array.vi
            while loop (processes the data until the end of the 12000 bytes, then stops)
                stacked sequence (processes the 12000 bytes of data):
                    seq 0: initialize counters
                    seq 1: extract some pre-defined bytes, convert and display on the front panel
                    seq 2: extract a specified bunch of bytes, create an XY chart/waveform
            } (end of the 12000 bytes of data)
        } (until the 16-bit parallel data is unavailable or the user presses stop)
    4) DIO Clear.vi
    The problem here is that processing the data takes time, a few seconds per bunch, and only then does the loop go back to DIO Read to fetch the next bunch of data. Until then the front-panel waveforms and digital displays are on hold/static.
    I want to process the data at the same time as it is being received into the buffer through the DAQ card, i.e. real-time processing: simultaneous acquisition and processing of the continuous data rather than bunch by bunch (in other words, two VIs should run in parallel: DIO Read.vi for acquisition, and a VI that processes that data).
    Can a queue VI solve this problem? Is there any other method?
    Your help is greatly appreciated. Thanks.

    Hi,
    This is definitely something that can be solved using a producer consumer loop.  You would have two loops running in parallel, have all of your data acquisition/generation done in the producer loop.  Use queues to transfer all that data from the producer loop to the consumer loop and then do any kind of processing in your consumer loop.
    The producer and consumer loops will run in parallel, and the consumer loop cannot run faster than the producer loop.
    The good thing is that you can get a producer consumer template easily.  In the LabVIEW startup window, select New (not New VI) >> From Template >> Frameworks >> Design Patterns >> Producer/Consumer Design Pattern (Data).
    Hope that helps.
    Regards,
    Raajit L
    National Instruments

  • Table for batch determination date

    Hi,
    I'm printing the dispensing slip and they want the output based on the batch determination date.
    Can anyone tell me in which table I will find the batch determination date?
    Regards
    Smitha

    Are you using a separate batch selection class versus the batch class?
    There are characteristics that must be in the selection class, but cannot be in the batch class.
    See OSS note 33396.
    1.  If you want to search for batches on the basis of a remaining shelf life in batch determination, the system has to calculate a requested shelf life expiration date dynamically from the information you give.
    2.  Add characteristic LOBM_VFDAT to the batch classes. For the revaluation of reference characteristics, refer to Note 78235. Characteristics LOBM_RLZ and LOBM_LFDAT must not be included in the batch classes!
    3.  Add characteristics LOBM_VFDAT, LOBM_LFDAT, and LOBM_RLZ to the selection classes. Maintain a remaining shelf life in the strategy records for batch determination. Relational operators (>, <, <=, >=) are considered in the dynamic calculation of the shelf life expiration date in batch determination.
    From your note you have placed LOBM_RLZ  and LOBM_LFDAT in your batch class.  You cannot do that.
    FF

  • How to process the received IDoc in SAP R/3? What needs to be done?

    Hi all,
    I am working on a file-to-IDoc scenario.
    The IDoc is received in SAP R/3, but how do I process the IDoc data so that it is stored in the SAP R/3 database?
    To be clear: how do I process the received IDoc data in SAP R/3? (This is for an inbound IDoc.)
    I hope someone can help me with the processing steps.
    Waiting for valuable inputs from the experts.
    Regards
    Rakesh

    Rakesh,
    check this <a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cdded790-0201-0010-6db8-beb9bb2b2660">Sample IDoc</a> document.
    Normally it will get processed based on the IDoc type: if it is an IDoc with master data it will create the appropriate master records, and if it is meant for a transaction it will create one.
    <a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/d19fe210-0d01-0010-4094-a6fba344e098">https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/d19fe210-0d01-0010-4094-a6fba344e098</a>

  • Basics for Batch Classification Data

    Hello experts,
    I have a few queries about batch classification data:
    1. In which tables is the batch classification data for batches stored?
    2. Which function module is called when we do a goods movement that determines the batch classification data?
    3. When we do a stock transfer for batches, the batch classification runs in the issuing plant and the batch classification data is copied to the receiving plant. Can you please tell me which object carries this data from the issuing plant to the receiving plant?
    4. There is also a user exit with which we can influence the batch classification; can you please give some information on this?
    Thank you very much for the help!
    Regards
    Shashank

    1. The table is AUSP - but why would you need to know that? There are good function modules to query, read and change the classification of batches, so why would you want to read directly from the DB?
    2. This depends on your release. Typically in MIGO the module VB_CREATE_BATCH is called, the same one that is called within the BAPI to create batches.
    3. See 2. When you create a batch with reference, you call the same module additionally with the key of the reference batch (this is the issuing batch).
    Inside that module are loads of user exits for all kinds of purposes.
    Why not have a look yourself in Customizing at Logistics General -> Batch Management -> Batch Valuation.
    There is the "customer exits for goods movements in inventory management" (this is called e.g. in the goods movement BAPI and the old transactions) --> EXIT_SAPMM07M_004
    and the "Valuation of Creation of New Batches Using Function Module" (this is called in MSCxN and in MIGO) --> EXIT_SAPLV01Z_014.
    In Customizing there's also the documentation what you can achieve with the exits.
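    As an example of such a function module, a sketch of reading a batch's classification through the standard BAPI instead of selecting from AUSP directly; object table MCH1 and class type 023 are the usual values for batches at material level and are assumptions here (with plant-level batches the key also contains the plant and the object table is MCHA):

    DATA: lv_matnr     TYPE matnr,
          lv_charg     TYPE charg_d,
          lv_class     TYPE klasse_d,              " batch class assigned to the material
          lv_objectkey TYPE bapi1003_key-object,
          lt_num       TYPE STANDARD TABLE OF bapi1003_alloc_values_num,
          lt_char      TYPE STANDARD TABLE OF bapi1003_alloc_values_char,
          lt_curr      TYPE STANDARD TABLE OF bapi1003_alloc_values_curr,
          lt_return    TYPE STANDARD TABLE OF bapiret2.

    lv_objectkey+0(18)  = lv_matnr.                " material
    lv_objectkey+18(10) = lv_charg.                " batch

    CALL FUNCTION 'BAPI_OBJCL_GETDETAIL'
      EXPORTING
        objectkey       = lv_objectkey
        objecttable     = 'MCH1'
        classnum        = lv_class
        classtype       = '023'                    " check your batch class type in CL02/MSC3N
      TABLES
        allocvaluesnum  = lt_num
        allocvalueschar = lt_char
        allocvaluescurr = lt_curr
        return          = lt_return.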
    Hope this helps.
    Stefan

  • Change Batch creation date in MCH1

    Hi all,
    I have one requirement from our users.
    They want to change the batch creation date in the MCH1 table.
    Is it advisable to do that?
    Please give your suggestions.
    Regards,
    sunny

    Hi
    Check with Transaction code MBC2.
    Regards
    Srilaxmi
