Transformation logic for a scenario

Hi Experts,
Our scenario is:
In DSO1, against doc A there will be docs created like B, C, D.
Of these (B, C, D) only one will have a follow-up doc, say E, created against it (here doc C); E has to be looked up in DSO2 via a common doc.
A -> B
A -> C -> E
A -> D
Requirement:
If any of the 3, i.e. B, C or D, has a follow-up doc (E), then populate a constant, say 'W', to InfoObject 'Status' against all 3, i.e. B, C, D.
There is no rule that A, B, C, D, E will be created on the same day.
Can anyone please let me know how to achieve this through a transformation from DSO1, where all of B, C, D will be present, while E has to be looked up in the other DSO2?
Thanks & Regards,
Bhadri M.

Hi,
First let me mention what I actually understood from your example. Your DSO1 has fields as below
Document1   Document2   Status
A           B
A           C
A           D
Similarly DSO2 has fields
Document1   Document2
C           E
What you want in DSO1 is below
Document1   Document2   Status
A           B           W
A           C           W
A           D           W
If this is the case, it can be achieved but would be a bit complex.
Hopefully the datasource that fetches the data to DSO1 is a Z datasource. Change your datasource logic in such a way that if a follow-up document B is created for a document A on a given day, it fetches the records of all follow-up documents created for document A till date, i.e. A->B, A->C, A->D. This way the data coming to the PSA contains all the documents created for A till date. Next, in the start routine you need to do the following:
Select from the active table of DSO2, FOR ALL ENTRIES IN source_package, where Document1 of DSO2 = Document2 of DSO1, into an internal table INT1.
Then loop over source_package. For each record, read INT1 for the document that appears in source_package as the follow-up document (i.e. check whether C exists in INT1). Whenever a follow-up document exists in INT1, copy the main document from source_package into another internal table INT2 (i.e. copy A into it). This way the loop collects all the main documents that have follow-up documents in DSO2.
After the loop ends, loop over source_package again.
Now read whether Document1 from source_package (A) exists in INT2 or not. If yes, update Status to 'W'.
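A minimal sketch of that start routine. The field names DOC1/DOC2, the STATUS field, and the DSO2 active table name /BIC/AZDSO200 are all assumptions; replace them with your actual objects:

TYPES: BEGIN OF ty_doc,
         doc1 TYPE c LENGTH 10,
         doc2 TYPE c LENGTH 10,
       END OF ty_doc.
DATA: int1   TYPE STANDARD TABLE OF ty_doc,  " DSO2 entries (e.g. C -> E)
      int2   TYPE STANDARD TABLE OF ty_doc,  " main docs that have a follow-up
      ls_doc TYPE ty_doc.
FIELD-SYMBOLS: <src> LIKE LINE OF source_package.

" 1) fetch DSO2 records whose Document1 is a follow-up doc of the package
IF source_package[] IS NOT INITIAL.
  SELECT doc1 doc2
    FROM /bic/azdso200
    INTO TABLE int1
    FOR ALL ENTRIES IN source_package
    WHERE doc1 = source_package-doc2.
  SORT int1 BY doc1.
ENDIF.

" 2) collect each main document (A) whose follow-up doc exists in DSO2
LOOP AT source_package ASSIGNING <src>.
  READ TABLE int1 WITH KEY doc1 = <src>-doc2
       BINARY SEARCH TRANSPORTING NO FIELDS.
  IF sy-subrc = 0.
    ls_doc-doc1 = <src>-doc1.
    APPEND ls_doc TO int2.
  ENDIF.
ENDLOOP.
SORT int2 BY doc1.
DELETE ADJACENT DUPLICATES FROM int2 COMPARING doc1.

" 3) set Status = 'W' on every record whose main doc is in INT2 (B, C and D alike)
LOOP AT source_package ASSIGNING <src>.
  READ TABLE int2 WITH KEY doc1 = <src>-doc1
       BINARY SEARCH TRANSPORTING NO FIELDS.
  IF sy-subrc = 0.
    <src>-status = 'W'.
  ENDIF.
ENDLOOP.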
That's it.

Similar Messages

  • Transformation logic for char and keyfigure from source keyfigures and flag

    Hi All,
       My requirement is to populate characteristic and key figure values from source key figures and a flag, i.e. a transformation converting a key-figure-based structure to an account-based structure. I am loading data from cube1 to cube2.
    cube1 structure with sample data:
    Plant   Furnace   ZFurnace   ZPlant1   ZPlant2   Flag
    P01     Blank     0          56        73        P
    Blank   F01       335        0         0         F
    Target cube structure with sample data:
    Plant   Furnace   FS (Char)   KYF   Flag
    P01     Blank     1           56    P
    Blank   F01       2           335   F
    P01     Blank     3           73    P
    ZPlant1, ZPlant2 and ZFurnace are the key figure technical names.
    FS has master data:
    FScode   KYF        Flag
    1        ZPlant1    P
    2        ZFurnace   F
    3        ZPlant2    P
    While loading data from the source cube1 I need to read the FS master data and then fill the FS code in the target cube based on the source flag and the key figure technical names.
    I would greatly appreciate (with points) if anyone can help me write the ABAP logic. The challenging part for me is comparing the key figure technical names in cube1 with the key figure values in the FS master data.
    Thanks.
    Baba.

    Hi All,
       Actually there will be 18 records in the FS master data and there will not be more than 18 key figures. Is there any way I can hardcode the values and write a small piece of code?
    something like:
    IF kyf = 'ZPLANT1'.
      wa_fscode  = '1'.
      wa_eu      = source_fields-zplant1.
      wa_plant   = source_fields-plant.
      wa_furnace = source_fields-furnace.
    ELSEIF kyf = 'ZPLANT2'.
      wa_fscode  = '3'.
      " ... and so on for the remaining key figures
    ENDIF.
    Please let me know the sample logic with code. I would greatly appreciate it (with points).
    Regards
    Baba
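    If you would rather not hardcode all 18 cases, here is a rough sketch of driving the mapping from the FS master data itself in an expert routine, reading each key figure dynamically via ASSIGN COMPONENT. The P table name /BIC/PZFS, its field names, and the target field names are assumptions:

    TYPES: BEGIN OF ty_fs,
             fscode TYPE c LENGTH 2,    " FS code, e.g. '1'
             kyf    TYPE c LENGTH 30,   " key figure technical name, e.g. 'ZPLANT1'
             flag   TYPE c LENGTH 1,    " 'P' or 'F'
           END OF ty_fs.
    DATA: lt_fs     TYPE STANDARD TABLE OF ty_fs,
          ls_fs     TYPE ty_fs,
          ls_result LIKE LINE OF result_package.
    FIELD-SYMBOLS: <source> LIKE LINE OF source_package,
                   <kyf>    TYPE any.

    " read the FS master data once (table and field names assumed)
    SELECT /bic/zfscode /bic/zkyf /bic/zflag
      FROM /bic/pzfs INTO TABLE lt_fs.

    LOOP AT source_package ASSIGNING <source>.
      " for every FS entry matching the record's flag, emit one target
      " row whose key figure value is read by its technical name
      LOOP AT lt_fs INTO ls_fs WHERE flag = <source>-flag.
        ASSIGN COMPONENT ls_fs-kyf OF STRUCTURE <source> TO <kyf>.
        CHECK sy-subrc = 0 AND <kyf> IS NOT INITIAL.
        CLEAR ls_result.
        ls_result-plant   = <source>-plant.
        ls_result-furnace = <source>-furnace.
        ls_result-fs      = ls_fs-fscode.
        ls_result-kyf     = <kyf>.
        ls_result-flag    = <source>-flag.
        APPEND ls_result TO result_package.
      ENDLOOP.
    ENDLOOP.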

  • Programming logic for this scenario

    hi all,
    kindly help me with this scenario:
    I have an internal table with fields like these (among others):
    OBJEK                ATINN        CHAR                       CHARG        CHAR1
    000000000000000031   0000000188   Batchnumber: WEEK NO. 9    0000000052
    000000000000000031   0000000189   Visualinspection: OK       0000000052
    Now what I need to do is: for the SAME batch number, concatenate the values of CHAR into CHAR1.
    That is to say, CHAR1 should have the value "Batchnumber: WEEK NO. 9 Visualinspection: OK".
    I've done it right now using 2 different internal tables and concatenating the values; I want to know if there's an easier and simpler way.
    any pointers guys??
    pk

    Solved it myself,
    thanks to Sujatha Reddy's post in the following thread:
    Re: at end of statement
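    For reference, the AT END OF technique boils down to something like this sketch (field names and types are assumed; note that AT END OF also fires when any field to the LEFT of the control field changes, which is why CHARG is placed first in the work table):

    TYPES: BEGIN OF ty_row,
             charg TYPE c LENGTH 10,   " batch number (control field, placed first)
             objek TYPE c LENGTH 18,
             char  TYPE string,        " value to collect
             char1 TYPE string,        " concatenated result
           END OF ty_row.
    DATA: lt_rows   TYPE STANDARD TABLE OF ty_row,
          ls_row    TYPE ty_row,
          lv_concat TYPE string.

    SORT lt_rows BY charg.
    LOOP AT lt_rows INTO ls_row.
      IF lv_concat IS INITIAL.
        lv_concat = ls_row-char.
      ELSE.
        CONCATENATE lv_concat ls_row-char INTO lv_concat SEPARATED BY space.
      ENDIF.
      AT END OF charg.
        " write the collected string back to every row of this batch
        ls_row-char1 = lv_concat.
        MODIFY lt_rows FROM ls_row TRANSPORTING char1
               WHERE charg = ls_row-charg.
        CLEAR lv_concat.
      ENDAT.
    ENDLOOP.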
    pk
    Edited by: prashanth kishan on Jul 11, 2008 9:19 AM

  • Require Fox-code logic for this scenario

    Hi folks,
    I have a multi-planning area consisting of one budget-planning cube and one actual cube. I am trying to create a planning function using a FOX formula wherein I need to map each liquidity item in the actual cube to multiple fund centres in the budget cube and finally save the data in the budget cube.
    Can anyone please provide Fox- code for the above scenario?
    thanks in advance
    regards
    Peter

    Hi,
    What is the problem with the FOX? Please provide more detail about what you want to do.

  • Process chain Logic for this scenario

    HI,
      We have a typical requirement that needs to be implemented using a process chain.
    We need to Load data from R/3 to PSA,
    then Load the data from the PSA into ODS1,
    then execute an ABAP program,
    then load the data from PSA into ODS2 (we cannot repull from R/3).
    Can someone provide me an idea on how to implement this? I am stuck on how to populate ODS2 from the PSA after executing the ABAP program.
    Thanks In advance,

    Hi Krishnamohan,
    I hope I'm getting your problem right. I think you can implement the below steps easily:
    Step 1 : We need to Load data from R/3 to PSA,
    Step 2 : then Load the data from the PSA into ODS1,
    Step 3 : then execute an ABAP program,
    You have options available for each of them in process chains.
    Now coming to your problem.
    Step 4 : then load the data from PSA into ODS2 (we cannot repull from R/3).
    I guess your ABAP program makes some changes to the data in the PSA, which you then want to load into ODS2.
    What you can do is create 2 InfoPackages; in the data targets tab of the first InfoPackage select ODS1 and uncheck ODS2, and likewise in the 2nd InfoPackage select ODS2 and uncheck ODS1.
    You can execute the first InfoPackage in the process chain, then your ABAP program, and then:
    in the process chain, among the process types, you have something called "Read PSA Data and Update Data Target".
    Select that process type, specify the 2nd InfoPackage name, and get it executed.
    Hope it helps.
    Thanks and Regards,
    Parth.

  • Need Mapping logic for the following scenario

    Hi everyone,
    I need a mapping logic for the following scenario.
    For the same order no with the same material no, the quantities should be summed and only one IDoc should be created.
    For the same order no with a different material no, there is no need to sum the quantity; again only one IDoc should be created.
    For example:
    Source Structure:
    Ord No   Mat No   QTY
    12       1        2
    13       1        3
    13       2        1
    12       2        4
    15       1        5
    14       3        7
    12       1        6
    Target Structure:
    Ord No   Mat No   QTY
    12       1        8
    12       2        4
    13       1        3
    13       2        1
    14       3        7
    15       1        5
    Thanks in Advance

    Try the graphical mapping shown below, using concat with a space as the delimiter and a UDF to split the value again by space.
    1. IDoc node:
    OrdNo (RootContext) + MatNo (RootContext)
    -> concat[" "] -> sort[ascending, case sensitive] -> splitByValue[ValueChanged] -> collapseContexts -> IDoc
    2. OrdNo:
    OrdNo (RC) + MatNo (RC)
    -> concat[" "] -> sort[ascending, case sensitive] -> splitByValue[ValueChanged] -> collapseContexts -> splitByValue[EachValue]
    -> UDF to fetch OrdNo (return var1.split(" ")[0];) -> OrdNo
    3. MatNo:
    OrdNo (RC) + MatNo (RC)
    -> concat[" "] -> sort[ascending, case sensitive] -> splitByValue[ValueChanged] -> collapseContexts -> splitByValue[EachValue]
    -> UDF to fetch MatNo (return var1.split(" ")[1];) -> MatNo
    4. Qty:
    key = concat[" "](OrdNo (RC), MatNo (RC))
    -> sortByKey[ascending, case sensitive](key, Qty (RC))
    -> formatByExample(sorted Qty, example contexts: sort[ascending, case sensitive](key) -> splitByValue[ValueChanged])
    -> sum -> Qty
    Regards,
    Sunil Chandra

  • I need to add the logic for posting key,

    I need to add logic for the posting key: if posting key = 50 then the amount is a credit (add a negative sign to the amount field).
    I created this scenario in mapping. We wanted to make sure that the negative sign is placed in the amount field if the posting key = 50, so it seems we are good; we will just need to test.
    How can I test this case?
    Please do the needful.
    Thanks in advance.

    Hi,
    You need to use an IfElse node function in your mapping in which the input should be the posting key from the source structure.
    You need to check in the ifElse whether posting key = 50; if true, use the concat node function to prefix the negative sign ("-") to the original value passed to the amount field on the target side. Else, if the condition is false (i.e. posting key != 50), simply pass the value to the amount field as it is, without the negative sign.
    After the mapping is done, go to the test tab, insert values into the source fields, and execute via the lower-left button, "Start Transformation". Test once with posting key = 50 and once with a value not equal to 50. In the first case the output amount field should have the negative sign; in the second it should not.
    You will find your target structure in the right half of your screen. That's it, mate.
    Thanks
    Biswajit

  • In MVC, do i need a View or Page with flow logic for POPUP window

    Hi All,
    I have the below scenario using the MVC pattern.
    I have a main view with 3 trays, and each tray has two buttons; for example, the first tray has a Create Order button. When I click on this button, I need a popup window to appear with a tableview and a button (Create), where I select some rows and click the Create button to create the order.
    But as per the MVC pattern I can't call the view (popup) from another view (the main view). So should I create a VIEW or a PAGE WITH FLOW LOGIC for the popup?
    I need 6 popups to be called from the main view, and once the function is done, the popup should close.
    Please suggest me the flow for this scenario.
    Cheers,
    Srini.

    Srini,
    1. You can call the view in a pop-up because you will be calling the controller using window.open.
    Here is the sample code:
    method DO_REQUEST .
      data:
            li_vw           type ref to   if_bsp_page,
            lv_form_field   type          string,
            li_md           type ref to   zcl_model01.
      dispatch_input( ).
      li_md ?= get_model( 'm01' ).
      lv_form_field = request->get_form_field( 'invoice_create' ).
      if lv_form_field is initial.
    *------ Request to display main page
        li_vw = create_view( view_name = 'main.htm' ).
        li_vw->set_attribute( name = 'model' value = li_md ).
        call_view( li_vw ).
      elseif lv_form_field eq 'true'.
    *------ Request to display Invoice page in pop-up
        li_vw = create_view( view_name = 'invoice.htm' ).
        li_vw->set_attribute( name = 'model' value = li_md ).
        call_view( li_vw ).
      endif.
    endmethod.
    Layout:
          function do_Invoice()
          { var s=0; r=1; w=300; h=300;
            var x=screen.width/2;
            x=x-w/2;
            var y=screen.height/4;
            y=y-h/2;
            popUp=window.open('main.do?invoice_create=true','win','width='+ w
            +',height='+ h +',left='+ x +',top='+ y);
          }
    Option2:
    Of course you can't bind the model in a page, because those are 2 different things. But all you need to do is access the model to get some value. To know how to access the model from a Page with Flow Logic, look at [this link|Passing model reference to a page in a Popup].
    Raja
    Edited by: Raja Thangamani on Apr 14, 2009 11:22 AM

  • Logic for Deletion of a row is not reflecting in DEV instance

    Hi ,
    I have a method in AM attached to a Search screen which has logic to delete rows from 3 different VOs.
    1. VO for selecting the rows from Details table.
    2. VO for selecting the rows from Master table.
    3. VO for holding the rows of results tables.
    The results table displays rows from both master table and details table. On selecting a row for delete in results table , the remove() method is being called to remove rows from the three VOs in the same order mentioned above.
    This logic works perfectly in my local setup. The commit was initially not getting recognized, so we included the line "JDBC\:processEscapes=true", after which delete and commit started working fine in my local setup.
    But the same code does not work in the DEV instance. Can anyone please suggest what can be done in such a case?
    Thanks,
    Chandrika

    Thanks for your reply,
    As we did the registration again, the autoreaction method for the problematic server is now visible in RZ20.
    But it is not picking up the alert for the scenarios. Also found that the other 2 servers are giving an error during the connection test; below is a screenshot.
    I checked the log file and found that the agent lock is not getting updated.
    Error log details:
    [Thr 4396]
    Fri Jun 27 08:12:44 2014
    INFO: Register central system PSM.
    INFO: Register central system: System PSM already registered as central system. Trying to update...
    ERROR: Register central system: cannot stop agent, because the agent is restarting.
    Fri Jun 27 08:44:56 2014
    INFO: Unregister central system PSM.
    ERROR: Unregister central system: cannot stop agent, because the agent is restarting.
    [Thr 7084]
    Fri Jun 27 08:52:07 2014
    INFO: Register central system PSM.
    INFO: Register central system: System PSM already registered as central system. Trying to update...
    ERROR: Register central system: cannot stop agent, because the agent is restarting.
    Can you please check and let me know how to resolve the error?

  • Logic for workflow

    Hi All,
    Kindly help me out with the logic building for this scenario:
    There is a customized workflow attached with Purchase Requisition approval. This workflow has an Org structure based on departments. The approver is determined based on admin id or the workflow initiator.
    Issue is: if the same user has to create a PR for a different department, he/she will have to log in with the admin's id for that department.
    Requirement : Same user should be able to create PRs for any department.
    So, is there a way of adding this condition with minimum change ?

    Hi Ameekar
    As I understand it, your problem is that the current agent determination process selects the appropriate approver based on the particular department creating the requisition.
    One solution might be to change the agent determination for the approval task to select the approver based on an organizational unit rather than by department.
    Org units are sort of free-standing HR groupings. You can create an org unit, assign people to it, and then designate that org unit as the agent for the approval task. Thus in your case you could create an org unit of approvers, assign multiple people to it, and requisitions would be routed to the org unit for approval regardless of department.
    Some things to consider:
    1. Everybody in the org unit will see the task in their worklist - first one to open it reserves it and others cannot work on it unless it's replaced back into the worklist
    2. It's nice for when people go on vacation, automatically takes care of the workload by allowing others to execute workitems
    3. It has the ability to deal with part-time people. When you assign a person to an org unit you can specify their hours of availability or the percentage of their time involvement; the system automatically takes this into consideration.
    Hope this helps,
    Brent

  • Logic for sending data files to multiple instances of Central

    We are using Central Pro Output Server 5.6 with a single Central instance, as per the default installation, on Windows Server 2003. Data for the transaction files comes from our iSeries system via a printer queue (\\.\pipe\jetform\queuename).
    Now we want to be able to produce more documents from Central much faster, and are therefore setting up multiple instances of Central. The problem is then where to put the logic for choosing an instance.
    The simplest way to do this would be to have the iSeries alternate the data files across different pipes (printer queues) for Central. But as we don't want to change our iSeries configuration for this, is there a way to solve this problem in Central?
    Any help with this is much appreciated

    Central provides no mechanism that I'm aware of that would do anything resembling the job distribution that you are wanting. Central is written to monitor its input folders and process the files it finds there. Each instance is essentially separate from the others.
    Your source system will need to select the appropriate instance to be used. Either that or you will need to have something between the source system and Central that is doing the distribution. For example, you could have the source system write to a folder that is not being monitored by Central and write a program that runs as a service that does monitor that folder. This program would then distribute the files. The likely drawback of having an intermediary program is that you are likely to not end up getting the documents printed any faster than with a single instance.
    Another possible way, if the source system can create files with different file name extensions, would be to have them all written to the same folder and have each instance checking that particular folder but looking for files with different extensions. This might be problematic, though, because it might also end up with each instance watching the same "control" folder so that doing things like pausing Central would end up with no control of which instance was going to be paused.
    The default setting for Central has it pausing for 5 seconds if it completes a job and there are no more files waiting for it. If your jobs are not coming in faster than Central is processing them then you would be getting some of this delay for your jobs. You could reduce this time or even set Central to process a job as soon as it shows up. I don't know if version 5.6 still has the problem, but an earlier version would not "see" a file if it happened to show up exactly when Central was looking for more (it was probably showing up milliseconds after Central looked). This caused that job to just sit there until the next job showed up.
    A major factor in getting the documents produced faster is going to be the speed and number of printers that they are going to - plus the number of pages in each document. For us, the single instance of Central that we are currently using can produce print much faster than our HP9050 printers (50 ppm) can actually print it.
    You must be doing a lot of forms or each job is doing a lot of processing. Our typical print job does 4 tasks (depending on the job this can include things like: passing the file through the transformation agent, updating a mainframe database, FTPing the file to another server for archiving, and producing the print). A typical job with 11 output pages takes only 2-3 seconds. We have a task that runs every morning that retrieves mainframe generated jobs. I just checked one of our servers and it processed 208 jobs in exactly 8 minutes (26 jobs per minute at an average of 2.3 seconds per job). We also thought we'd need multiple instances due to speed but that just isn't the case for us. We have other reasons to move to multiple instances but speed in not a major factor any more.

  • Using macbook pro + logic for live - what to consider

    So I just received the MacBook Pro, which I will use live with Logic for playing back bounced audio tracks, audio instruments, and live inputs + effects.
    I want to avoid (if possible) an additional external drive, in order to keep the setup simple.
    I'm thinking of partitioning the internal drive into two parts and using one partition for the audio material.
    Or is it maybe better to keep the whole drive as one partition (80GB)?
    Any further suggestions? E.g. which OS do you think is more stable, 10.4.6 or 10.4.7?

    Hi. I use my MBP live for Ivory Grand piano sounds and occasional backing tracks with live piano. I suggest you set it all up and suck and see. You will quickly get an idea of what will and won't work. Set up a scenario where it is doing a fair bit more work than you anticipate it doing live, and then you will be confident of it doing what you want it to do.
    Partitioning a hard drive just means that the same single hard drive is trying to split its attention between two places at once, and so will probably give worse performance. It is no real big deal to carry a small FireWire drive these days. By the time you have plugged in MIDI interfaces and cables, an extra FireWire cable will not make much difference. Samples stream much faster from an external drive. Audio should play back just fine from the internal hard drive.
    I am using 10.4.7 and I don't seem to be having any problems so far, touch wood.

  • ALE Configurations for IDOC2IDOC scenario

    Hi Experts,
    Can anyone list down the ALE configurations that are required for an IDOC2IDOC scenario?
    1. Sender SAP ---> XI
    2. XI ---> Reciever SAP
    Please provide me the TCodes with the details that are needed for the above scenario.
    Your help is highly appreciated.
    Rgds
    Faisal....
    Edited by: Abdul Faisal on Jul 28, 2008 11:49 AM

    Hey,
    Steps for ALE settings:-
    Steps for XI
    Step 1)
         Goto SM59.
         Create new RFC destination of type 3(Abap connection).
         Give a suitable name and description.
         Give the Ip address of the R3 system.
         Give the system number.
         Give the gateway host name and gateway service (3300 + system number).
         Go to the logon security tab.
         Give the lang, client, username and password.
         Test connection and remote logon.
    Step 2)
         Goto IDX1.
         Create a new port.
         Give the port name.
         Give the client number for the R3 system.
         Select the created Rfc Destination.
    Step 3)
         Goto IDX2
         Create a new Meta data.
         Give the Idoc type.
         Select the created port.
    Steps for R3.
    Step 1)
         Goto SM59.
         Create new RFC destination of type 3(Abap connection).
         Give a suitable name and description.
         Give the Ip address of the XI system.
         Give the system number.
         Give the gateway host name and gateway service (3300 + system number).
         Go to the logon security tab.
         Give the lang, client, username and password.
         Test connection and remote logon.
    Step 2)
         Goto WE21.
         Create a port under transactional RFC.(R3->XI)
         Designate the RFC destination created in prev step.
    Step 3)
         Goto SALE.
         Basic settings->Logical Systems->Define logical system.
         Create two logical systems(one for XI and the other for R3)
         Basic settings->Logical Systems->Assign logical system.
         Assign the R3 logical system to respective client.
    Step 4)
         Goto WE20.
         Partner type LS.
         Create partner profile.
         Give the outbound or inbound message type based on the direction.
                    In your Sender R3:- outbound
                    In your Receiver R3:- Inbound.
    Step 5)
         Goto WE19
         Give the basic type and execute.
         fill in the required fields.
         Goto IDOC->edit control records.
         Give the following values.(Receiver port,partner no.,part type and sender Partner no. and type)
         Click outbound processing.
    Step 6)
         Go to SM58
         if there are any messages then there is some error in execution.
         Goto WE02.
         Check the status of the IDOC.
         Goto WE47.
         to decode the status code.
    Step 7)
         Not mandatory.
         Goto BD64.
         Click on Create model view.
         Add message type.
    BD87 to check the status of IDOC.
    In case if not authorized then go to the target system and check in SU53, see for the missing object
    and assign it to the user.
    SAP r3
    sm59(status check)(no message)
    WE02(status check)
    WE05(status check)
    BD87(status check)
    WE42 process code
    WE47 status info.
    <removed by moderator>
    regards,
           Milan
    Edited by: Mike Pokraka on Jul 28, 2008 5:04 PM

  • Logic for carry forward of previous stock to current period stock.

    Hi Experts,
    Client is already using MC.9 to see the stock analysis report; however, as per their requirement, we are exploring BOMs as well as fetching quantities at table level. In my report I am experiencing difficulty carrying forward the previous period's closing stock quantity into the current period's stock quantity when there are no receipts for the current period; MC.9, however, does exactly this.
    Could anyone tell me the logic behind MC.9 that carries forward the previous period's closing stock into the current period's stock quantity in the report?
    As per the requirement I am using S031, S032, S033, but I am unable to work out the logic for carrying the previous month's stock quantity forward to the current month.
    I have a requirement to create a report showing material stock period-wise for each plant in the format below.
    LFGJA/LFMON   ROH (MT)   HALB (MT)   FERT (MT)
    11.2013       100.000    121.000     121.00
    12.2013       50.000     12.000      123.00
    01.2014       23.231     23.234      45.342
    02.2014       23.231     34.094      45.098
    03.2014       34.098     98.983      00.000
    04.2014       00.000     69.093      98.098
    05.2014       00.000     89.098      00.000
    For example, the break-up of the ROH material plant-wise in the format below:
    LFGJA/LFMON   WERKS   MENGE (MT)
    11.2013       P001    30.000
    11.2013       P002    50.000
    11.2013       P003    20.00
    Thanks in advance,
    SKN

    Hi,
       The last period's closing stock = the current period's opening stock. You may get the details from the MBEWH and S032 tables. Refer the doc: Material Stock and Valuation History tables - how to read them
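    A rough sketch of that carry-forward read: MBEWH holds a record only for periods in which the stock changed, so you fall back to the most recent entry at or before the requested period. The variable names p_matnr, p_werks, p_gjahr and p_monat are assumptions:

    DATA: BEGIN OF ls_hist,
            lfgja TYPE mbewh-lfgja,
            lfmon TYPE mbewh-lfmon,
            lbkum TYPE mbewh-lbkum,   " valuated stock quantity
          END OF ls_hist,
          lt_hist  LIKE STANDARD TABLE OF ls_hist,
          lv_stock TYPE mbewh-lbkum.

    SELECT lfgja lfmon lbkum FROM mbewh
      INTO TABLE lt_hist
      WHERE matnr = p_matnr
        AND bwkey = p_werks.          " valuation area = plant, typically

    " latest history entry at or before the requested period
    SORT lt_hist BY lfgja DESCENDING lfmon DESCENDING.
    LOOP AT lt_hist INTO ls_hist
         WHERE lfgja < p_gjahr
            OR ( lfgja = p_gjahr AND lfmon <= p_monat ).
      lv_stock = ls_hist-lbkum.       " closing stock carried into the period
      EXIT.
    ENDLOOP.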
    Regards,
    AKPT

  • MIRO MIGO ENHANCEMENT FOR INDIAN SCENARIO - IMPORTS PURCHASE ORDERS

    There is an enhancement requirement in the MIRO activity. In India you have to post 7 customs conditions for each item, and the MIRO activity is completely manual: all the values and quantities for the 7 conditions have to be entered. Can this be improved and handled in a better way?
    In the MIGO activity for the manufacturing scenario, the excise values are retrieved and set off; but for the trading scenario, where the excise conditions are customized [ZCV1, ZEC1, ZEC2, ZAD1] to load on inventory, the excise values have to be entered. Can this activity be enhanced so that these values come directly from the MIRO duty postings, where all the values are entered?
    Best regards,
    Pankaj

    Hi,
    In a normal import process, before the goods receipt, you do invoice verification for customs using MIRO, with planned delivery costs as the option.
    As the goods receipt doesn't exist yet, the system does not propose the customs conditions.
    As the goods receipt quantity is unknown and there is always the possibility of a partial invoice verification for customs, the system will not directly populate the values in MIRO.
    An enhancement to populate these values may not be needed; please convince the user regarding the import process.
    Whether any other option is available for that, I really don't know.
    Thanks
