How to write/develop this Start & Field Routine

Hi Experts,
As I am new to BW (a learner), please advise me on how I can achieve this and provide the relevant start routine and field routine.
My requirement is:
Employee is compounded to Location.
On a weekly or monthly basis (depending on the employee's payroll run), the employee is assigned a wage type, an amount for that wage type, and the payroll date (when the payroll was run).
Data I currently have / data coming from the source system:
Loc_ID | Emp_ID | Wage_ID | Payroll_Date | Amount
1      | 999    | 001     | 08.08.2008   | 100.00
1      | 999    | 088     | 08.08.2008   | 560.00
1      | 999    | 345     | 08.08.2008   | 437.00
1      | 999    | 001     | 08.07.2008   | 654.00
1      | 999    | 088     | 08.07.2008   | 389.00
1      | 999    | 345     | 08.07.2008   | 893.00
1      | 999    | 264     | 08.06.2008   | 600.00
1      | 999    | 345     | 08.08.2008   | 365.00
(An employee may have a different Wage_ID and Amount for each payroll.)
My requirement is to add a new key figure 'Previous_Amount', populated with the amount of the previous payroll for the same Wage_ID.
Loc_ID | Emp_ID | Wage_ID | Payroll_Date | Amount | Previous_Amount
1      | 999    | 001     | 08.08.2008   | 100.00 | 654.00
1      | 999    | 088     | 08.08.2008   | 560.00 | 389.00
1      | 999    | 345     | 08.08.2008   | 437.00 | 893.00
1      | 999    | 001     | 08.07.2008   | 654.00 | 0
1      | 999    | 088     | 08.07.2008   | 389.00 | 0
1      | 999    | 345     | 08.07.2008   | 893.00 | 365.00
1      | 999    | 264     | 08.06.2008   | 600.00 |
1      | 999    | 345     | 08.08.2008   | 365.00 |
As a starter in BW, I am struggling to write a start routine in the transformation (DSO --> Cube) that transfers the data from the DSO active table into an internal table, and a field routine that updates the Previous_Amount field by sorting the internal table and picking, for that particular Wage_ID, the employee's latest record with a payroll date earlier than the current one, and populating that amount into Previous_Amount.
Please make the necessary corrections to the start routine below, fixing where I went wrong, and provide the required field routine (which will read the data from the internal table used in the start routine).
Start Routine
* Note: ITAB_DSOP and WA_DSOP should be declared in the global part so
* that the field routine can read ITAB_DSOP as well.
DATA: ITAB_DSOP TYPE STANDARD TABLE OF /BIC/AZDFREW200,
      WA_DSOP   TYPE /BIC/AZDFREW200.
DATA: WA_PACKAGE LIKE LINE OF SOURCE_PACKAGE.
DATA: L_TABIX TYPE sy-tabix.

IF NOT SOURCE_PACKAGE[] IS INITIAL.

  SORT SOURCE_PACKAGE BY /BIC/ZLOC      ASCENDING
                         /BIC/ZEMP_ID   ASCENDING
                         /BIC/ZPAY_DATE ASCENDING
                         /BIC/ZWGE_ID   ASCENDING.

  CLEAR ITAB_DSOP[].

  SELECT * FROM /BIC/AZDFREW200 INTO TABLE ITAB_DSOP
    FOR ALL ENTRIES IN SOURCE_PACKAGE
    WHERE /BIC/ZLOC      = SOURCE_PACKAGE-/BIC/ZLOC
      AND /BIC/ZEMP_ID   = SOURCE_PACKAGE-/BIC/ZEMP_ID
      AND /BIC/ZWGE_ID   = SOURCE_PACKAGE-/BIC/ZWGE_ID
      AND /BIC/ZPAY_DATE < SOURCE_PACKAGE-/BIC/ZPAY_DATE.

  IF sy-subrc = 0.
*   Newest earlier payroll first, per location/employee/wage type.
    SORT ITAB_DSOP BY /BIC/ZLOC      ASCENDING
                      /BIC/ZEMP_ID   ASCENDING
                      /BIC/ZWGE_ID   ASCENDING
                      /BIC/ZPAY_DATE DESCENDING.

    LOOP AT SOURCE_PACKAGE INTO WA_PACKAGE.
      L_TABIX = sy-tabix.
      READ TABLE ITAB_DSOP INTO WA_DSOP WITH KEY
           /BIC/ZLOC    = WA_PACKAGE-/BIC/ZLOC
           /BIC/ZEMP_ID = WA_PACKAGE-/BIC/ZEMP_ID
           /BIC/ZWGE_ID = WA_PACKAGE-/BIC/ZWGE_ID.
      IF sy-subrc = 0.
*       This is where I am stuck: the amount from WA_DSOP should end
*       up in Previous_Amount of the corresponding target record.
      ENDIF.
    ENDLOOP.

  ENDIF.
ENDIF.
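What I am aiming for in the field routine for Previous_Amount is roughly the sketch below. The amount field name /BIC/ZAMOUNT is only a guess, and ITAB_DSOP would have to be declared in the global part so that the field routine can see it:
* Rough field-routine sketch for Previous_Amount (names are guesses,
* in particular /BIC/ZAMOUNT for the amount field of the DSO).
DATA: WA_PREV TYPE /BIC/AZDFREW200.

CLEAR RESULT.
LOOP AT ITAB_DSOP INTO WA_PREV
     WHERE /BIC/ZLOC      = SOURCE_FIELDS-/BIC/ZLOC
       AND /BIC/ZEMP_ID   = SOURCE_FIELDS-/BIC/ZEMP_ID
       AND /BIC/ZWGE_ID   = SOURCE_FIELDS-/BIC/ZWGE_ID
       AND /BIC/ZPAY_DATE < SOURCE_FIELDS-/BIC/ZPAY_DATE.
* ITAB_DSOP is sorted with the newest earlier payroll first, so the
* first hit is the previous payroll for this wage type.
  RESULT = WA_PREV-/BIC/ZAMOUNT.
  EXIT.
ENDLOOP.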
Please provide the start routine and field routine required to achieve this requirement.
Thanks in advance

Hi,
I created a small example for your case. I will explain the logic I used and give you the code; make the necessary modifications and use it in your case.
1) First bring all the input that exists in SOURCE_PACKAGE into a temporary table created with the same structure as the output.
2) Sort both the input (SOURCE_PACKAGE) and the temporary table by wage id and date.
3) Next compare both of them and get the value of the previous price into the temporary table from SOURCE_PACKAGE.
4) Write a routine at the field level to populate the previous price by matching both wage id and payroll date (see the field-routine sketch after the code below).
5) Between steps 3 and 4 I used one more temporary table to set the previous price of the oldest record per wage id to 0.
GLOBAL PART
TYPES: BEGIN OF itabtype,
         job_no         TYPE /bic/oirt_jobno,
         wage_id        TYPE /bic/oirt_wgid1,
         date           TYPE /bic/oizfrm_dt,
         prc            TYPE /bic/oirt_prc,
         previous_price TYPE /bic/oirt_prc1,
       END OF itabtype.
DATA: itab     TYPE STANDARD TABLE OF itabtype
                    WITH NON-UNIQUE KEY wage_id date,
      wa_itab  LIKE LINE OF itab.
DATA: itab1    TYPE STANDARD TABLE OF itabtype
                    WITH NON-UNIQUE KEY wage_id date,
      wa_itab1 LIKE LINE OF itab1.
* Create internal tables with all the fields you want to have in the
* output.
LOCAL PART
  DATA: wa_SOURCE_PACKAGE TYPE tys_SC_1.
  DATA: tmp(2) TYPE n VALUE '01'.

  LOOP AT SOURCE_PACKAGE INTO wa_SOURCE_PACKAGE.
    MOVE wa_SOURCE_PACKAGE-/bic/rt_jobno TO wa_itab-job_no.
    MOVE wa_SOURCE_PACKAGE-/bic/rt_wgid1 TO wa_itab-wage_id.
    MOVE wa_SOURCE_PACKAGE-/bic/zfrm_dt  TO wa_itab-date.
    MOVE wa_SOURCE_PACKAGE-/bic/rt_prc   TO wa_itab-prc.
    APPEND wa_itab TO itab.
  ENDLOOP.
* The above loop gets all the values into the internal table.

  SORT itab BY wage_id ASCENDING date DESCENDING.
  SORT SOURCE_PACKAGE BY /bic/rt_wgid1 ASCENDING /bic/zfrm_dt
       DESCENDING.

* For each record, take the price of the next older record with the
* same wage id as the previous price.
  LOOP AT itab INTO wa_itab.
    tmp = '1'.
    LOOP AT SOURCE_PACKAGE INTO wa_SOURCE_PACKAGE FROM tmp.
      IF wa_itab-date GT wa_SOURCE_PACKAGE-/bic/zfrm_dt
         AND wa_itab-wage_id = wa_SOURCE_PACKAGE-/bic/rt_wgid1.
        wa_itab-previous_price = wa_SOURCE_PACKAGE-/bic/rt_prc.
        tmp = tmp + 1.
        EXIT.
      ENDIF.
    ENDLOOP.
    MODIFY itab FROM wa_itab.
  ENDLOOP.

* The oldest record per wage id gets previous price 0.
  itab1[] = itab[].
  SORT itab1 BY wage_id ASCENDING date ASCENDING.
  DELETE ADJACENT DUPLICATES FROM itab1 COMPARING wage_id.
  LOOP AT itab1 INTO wa_itab1.
    wa_itab1-previous_price = 0.
    MODIFY itab1 FROM wa_itab1.
  ENDLOOP.

* Merge those zero values back into itab.
  LOOP AT itab INTO wa_itab.
    LOOP AT itab1 INTO wa_itab1.
      IF wa_itab1-date = wa_itab-date AND
         wa_itab1-wage_id = wa_itab-wage_id.
        wa_itab-previous_price = wa_itab1-previous_price.
        EXIT.
      ENDIF.
    ENDLOOP.
    MODIFY itab FROM wa_itab.
  ENDLOOP.
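For step 4, the field routine on the previous price target can then be a small read on itab. A sketch, assuming itab from the global part above is visible in the field routine and that SOURCE_FIELDS carries the same wage id and date fields as used above:
* Field-routine sketch for the previous price (adjust the names to
* your own objects; itab/wa_itab come from the global part above).
  READ TABLE itab INTO wa_itab
       WITH KEY wage_id = SOURCE_FIELDS-/bic/rt_wgid1
                date    = SOURCE_FIELDS-/bic/zfrm_dt.
  IF sy-subrc = 0.
    RESULT = wa_itab-previous_price.
  ELSE.
    CLEAR RESULT.
  ENDIF.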
This looks long, but the logic is simple.
Get back to me if you have any queries.
Regards,
Vamsi

Similar Messages

  • At startup I get a request to install a Bitdefender extension for Safari. I am sure this is a virus as the link is not familiar; how do I remove this startup request?

    Hi Pero127,
    What I mean is was this a package installer, or an actual extension?
    If just an extension, when you double click one, the OS asks if you want the extension installed. In which case, yes, that is the only item installed on your computer and you can then just uninstall it.
    But if it was an application installer package, then it could have installed many more items onto your computer than just the extension. The rest will still be on your drive after removing the extension.

  • Start/Field Routine in Transformations (ABAP)

    Hi Experts
    Please advise me on how to proceed and the ABAP code required.
    DSOFULL->CUBE
    DSO Active Table: /BIC/AEMPPED00
    Data in DSO:
    Emp_ID | Emp_SDATE  | Emp_TDate
    1      | 01.01.2008 |
    1      | 01.01.2008 | 01.06.2008
    1      | 01.01.2008 |
    Data expected in CUBE:
    Emp_ID | Emp_SDATE  | Emp_TDate  | Ter_Date
    1      | 01.01.2008 |            | 01.06.2008
    1      | 01.01.2008 | 01.06.2008 | 01.06.2008
    1      | 01.01.2008 |            |
    I need a start routine or field routine such that, while loading data from the DSO to the cube, the code looks for the employee's latest record and, if it finds an Emp_TDate value there, populates Ter_Date in the cube for both records with that Emp_TDate.
    Please advise

    Hi,
    I am providing you sample code; please modify it (field names and table names as per your requirement).
    Please write the code in the transformation rule of field Emp_TDate.
    Map field Emp_SDATE to the target field for Emp_TDATE.
    DATA: l_sdate TYPE /bic/aempped00-emp_sdate.
    SELECT SINGLE emp_sdate FROM /bic/aempped00
      INTO l_sdate
      WHERE emp_sdate NE ' '.
    IF sy-subrc IS INITIAL.
      result = SOURCE_FIELDS-emp_sdate.
    ELSE.
      result = ' '.
    ENDIF.
    Please replace the Emp_SDATE field with the actual source field name.
    But I still have some questions...
    1. On what basis do you decide the latest record?
    Can you please explain the scenario a bit more clearly.
    Thanks
    Dipika
    Edited by: Dipika Tyagi on Jun 24, 2008 8:47 AM
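    If "latest" simply means the employee's most recent record that carries a termination date, a hedged variant of the rule could read it straight from the active table. The field names emp_id and emp_tdate below are taken from this thread and are assumptions; substitute the real component names.
    * Hedged variant: take the employee's most recent termination date.
    SELECT MAX( emp_tdate ) FROM /bic/aempped00
      INTO result
      WHERE emp_id = SOURCE_FIELDS-emp_id.
    * RESULT stays initial if the employee has no termination date.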

  • HT201317 Since I have downloaded the iOS 7 software my photos are no longer uploaded to my laptop. How can I make this start happening again?

    Since I have downloaded the iOS 7 software my photos no longer automatically sync and upload to my laptop. How can I make this happen again?

    Hi missvashenca,
    Thanks for using Apple Support Communities.  If your photos are not appearing on your computer from your iOS device anymore through Photo Stream, I'd first verify that the Photo Stream service is still enabled on your iOS device:
    iCloud: Set up Photo Stream
    http://support.apple.com/kb/PH2605
    If it is, you may also want to try switching it off and back on again.
    Likewise, I would also try this with your computer's Photo Stream settings.
    As always, I recommend backing up your photos before performing troubleshooting:
    iOS: How to back up and restore your content
    http://support.apple.com/kb/ht1766
    Cheers,
    - Ari

  • How to write FQL in querytext field in SharePoint 2013 Search REST API?

    I am having a double-quote symbol issue in my search app, the same as http://www.silver-it.com/node/127 describes.
    To avoid this issue, I am trying to understand: is it possible to make a search request in FQL against the 2013 Search REST web service?
    I know that we can use KQL in querytext, like
    http://sp2013/sites/search/_api/search/query?querytext='Title:"123#123"'&enablefql=false&rowlimit=100&selectproperties='Title'
    But is it possible to write it in FQL? I found that there is an enableFQL property we can set in the REST API. What is its function?
    http://sp2013/sites/search/_api/search/query?querytext='test'&enablefql=true&rowlimit=100&selectproperties='Title'
    http://sp2013/sites/search/_api/search/query?querytext='string("test case",mode="simpleall")'&enablefql=true&rowlimit=100&selectproperties='Title'
    Right now, the two requests above just give me this error.
    HTTP/1.1 500 Internal Server Error
    {"error":{"code":"-1, Microsoft.Office.Server.Search.REST.SearchServiceException","message":{"lang":"en-US","value":"We didn't understand your search terms. Make sure they're using proper
    syntax."},"innererror":{"message":"We didn't understand your search terms. Make sure they're using proper syntax

    http://sp2013/sites/Search/_api/search/query?querytext='string("cat+dog+fox",+mode="and")'&enablefql=true&rowlimit=100&selectproperties='Title'&sourceid='ad5a2ca4%2D91eb%2D44de%2D98f7%2D9af1c1eefef3'
    The response is:
    HTTP/1.1 400 Bad Request
    {"error":{"code":"-1, Microsoft.SharePoint.Client.InvalidClientQueryException","message":{"lang":"en-US","value":"Guid should contain 32 digits with 4 dashes (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx)."},"innererror":{"message":"Guid should contain 32 digits with
    4 dashes (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx).","type":"Microsoft.SharePoint.Client.InvalidClientQueryException","stacktrace":"   at Microsoft.SharePoint.Client.Rest.EdmClientValue.ConvertTo[T]()\r\n   at Microsoft.Office.Server.Search.REST.SearchServiceServerStub.query_MethodProxy(SearchService
    target, ClientValueCollection xmlargs, ProxyContext proxyContext)\r\n   at Microsoft.Office.Server.Search.REST.SearchServiceServerStub.InvokeMethod(Object target, String methodName, ClientValueCollection xmlargs, ProxyContext proxyContext, Boolean&
    isVoid)\r\n   at Microsoft.SharePoint.Client.ServerStub.InvokeMethodWithMonitoredScope(Object target, String methodName, ClientValueCollection args, ProxyContext proxyContext, Boolean& isVoid)\r\n   at Microsoft.SharePoint.Client.Rest.RestRequestProcessor.InvokeMethod(Boolean
    mainRequestPath, Object value, ServerStub serverProxy, EdmParserNode node, Boolean resourceEndpoint, MethodInformation methodInfo, Boolean isExtensionMethod, Boolean isIndexerMethod)\r\n   at Microsoft.SharePoint.Client.Rest.RestRequestProcessor.GetObjectFromPathMember(Boolean
    mainRequestPath, String path, Object value, EdmParserNode node, Boolean resourceEndpoint, MethodInformation& methodInfo)\r\n   at Microsoft.SharePoint.Client.Rest.RestRequestProcessor.GetObjectFromPath(Boolean mainRequestPath, String path, String
    pathForErrorMessage)\r\n   at Microsoft.SharePoint.Client.Rest.RestRequestProcessor.Process()\r\n   at Microsoft.SharePoint.Client.Rest.RestRequestProcessor.ProcessRequest()\r\n   at Microsoft.SharePoint.Client.Rest.RestService.ProcessQuery(Stream
    inputStream, IList`1 pendingDisposableContainer)"}}}

  • Please help me how to write/generate this report

    Hello all,
    Can anybody tell me how to write a report to display the customer's daily invoices and the overdue, not-paid invoices?
    At least help me with which tables should be used and how the coding should be done for not-paid invoices.
    Waiting for your answers,
    thanking you in advance,
    yours sincerely,
    Venkatesh Babu

    Hi,
    Customer's daily invoices: VBRK & VBRP tables.
    Invoices cleared / not cleared: BSAD (cleared) & BSID (not cleared).
    Formulate your logic with this information; a rough sketch follows below.
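    As a hedged sketch only, not a finished report: the report name, the selection-screen parameters p_kunnr and p_date and the overdue rule below are illustrative assumptions; the fields used are standard fields of VBRK and BSID.
    REPORT z_customer_invoices.

    PARAMETERS: p_kunnr TYPE kunnr,
                p_date  TYPE sy-datum DEFAULT sy-datum.

    TYPES: BEGIN OF ty_open,
             kunnr TYPE bsid-kunnr,
             belnr TYPE bsid-belnr,
             budat TYPE bsid-budat,
             wrbtr TYPE bsid-wrbtr,
             zfbdt TYPE bsid-zfbdt,
             zbd1t TYPE bsid-zbd1t,
           END OF ty_open.

    DATA: lt_inv  TYPE STANDARD TABLE OF vbrk,
          lt_open TYPE STANDARD TABLE OF ty_open,
          ls_open TYPE ty_open,
          lv_due  TYPE sy-datum.

    * Daily billing documents for the customer.
    SELECT * FROM vbrk INTO TABLE lt_inv
      WHERE kunag = p_kunnr
        AND fkdat = p_date.

    * Open (not cleared) customer items from BSID.
    SELECT kunnr belnr budat wrbtr zfbdt zbd1t
      FROM bsid INTO TABLE lt_open
      WHERE kunnr = p_kunnr.

    * An item counts as overdue when baseline date + payment days < today.
    LOOP AT lt_open INTO ls_open.
      lv_due = ls_open-zfbdt + ls_open-zbd1t.
      IF lv_due < sy-datum.
        WRITE: / ls_open-belnr, ls_open-budat, ls_open-wrbtr, 'overdue'.
      ENDIF.
    ENDLOOP.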
    Please search SCN before posting generic questions & close this thread.
    Regards,
    Amit

  • How do i make this a sub routine

    I have a little script to drill down through folders and add comments to every folder and every file inside the parent folder. It works perfectly well when run by itself. This was based heavily on the script found by NovaScotian here http://discussions.apple.com/thread.jspa?messageID=2182916&#2182916
    I basically want to make this script into a subroutine, and I have tried various methods and haven't been able to make it work.
    global the_comments
    global each_item
    global pete
    global ddp
    global ddp2
    set ddp to {}
    set ddp2 to {}
    set theLetter to "s"
    set newDroppedFiles to choose folder with multiple selections allowed
    to drill_down(a_folder)
    tell application "Finder"
    set file_list to files of a_folder
    set folder_list to folders of a_folder
    end tell
    repeat with i in file_list
    processDrilledDownFile(i)
    tell application "Finder"
    set comment of pete to text of the_comments
    end tell
    end repeat
    repeat with i in folder_list
    copy i to end of ddp2
    drill_down(i)
    end repeat
    end drill_down
    on processDrilledDownFile(a_file)
    copy a_file to end of ddp
    end processDrilledDownFile
    set the_comments to (do shell script "/usr/bin/defaults read com.Rich.move " & theLetter & "Text")
    repeat with each_item in newDroppedFiles
    set pete to each_item
    tell application "Finder"
    set comment of each_item to text of the_comments
    end tell
    drill_down(pete)
    end repeat
    repeat with each_item in ddp
    tell application "Finder"
    set comment of each_item to text of the_comments
    end tell
    end repeat
    repeat with each_item in ddp2
    tell application "Finder"
    set comment of each_item to text of the_comments
    end tell
    end repeat
    regards
    Rich

    Your script already includes several subroutines, so the trick is separating those out first.
    Essentially your script consists of three elements - global declarations, your own subroutines and a run handler. Since there's no explicit 'on run', everything that's not inside a subroutine becomes the run handler.
    Therefore what you should do is strip out all the code that's not a global declaration or an existing handler and just wrap that in a new 'on subroutinename()... end subroutinename' block.
    For sanity's sake I'd also add a specific run handler to bind it all together.
    This means you'll end up with something like:
    global the_comments
    global each_item
    global pete
    global ddp
    global ddp2
    on run
    my myNiftySubroutine()
    end run
    to drill_down(a_folder)
    tell application "Finder"
    set file_list to files of a_folder
    set folder_list to folders of a_folder
    end tell
    repeat with i in file_list
    processDrilledDownFile(i)
    tell application "Finder"
    set comment of pete to text of the_comments
    end tell
    end repeat
    repeat with i in folder_list
    copy i to end of ddp2
    drill_down(i)
    end repeat
    end drill_down
    on processDrilledDownFile(a_file)
    copy a_file to end of ddp
    end processDrilledDownFile
    on myNiftySubroutine()
    set ddp to {}
    set ddp2 to {}
    set theLetter to "s"
    set newDroppedFiles to choose folder with multiple selections allowed
    set the_comments to (do shell script "/usr/bin/defaults read com.Rich.move " & theLetter & "Text")
    repeat with each_item in newDroppedFiles
    set pete to each_item
    tell application "Finder"
    set comment of each_item to text of the_comments
    end tell
    drill_down(pete)
    end repeat
    repeat with each_item in ddp
    tell application "Finder"
    set comment of each_item to text of the_comments
    end tell
    end repeat
    repeat with each_item in ddp2
    tell application "Finder"
    set comment of each_item to text of the_comments
    end tell
    end repeat
    end myNiftySubroutine
    There's just one other observation - most of the globals are not actually needed since you can just pass them as parameters to the subroutines.
    I'm also not sure what the purpose of the script is - the upshot seems to be that you're setting the comment of items in the Finder, however, since you:
    set comment of pete to text of the_comments
    and pete is a global pointing to the folder you're processing, surely this sets the comment of the folder x times, where x is the number of files in the folder - in other words if there are 100 files in the folder, you set the comment of the folder 100 times and don't touch the files themselves. Is that what you intend?

  • The volume on my laptop just completely stopped, how do I get this started again?

    I have a silver HP Windows 7 Premium laptop with an Intel Core i5 and Beats Audio. The volume of my computer completely stopped. I have tried everything to restore its original settings; not sure what was changed! I can hear the log-in sound, but my computer won't play any of my iTunes songs or YouTube videos. I did not damage my computer, so I am not sure what happened. I tried the volume test buttons, but a caption comes up saying "Failure to test". I am not sure what is wrong. Please help.

    Hi Conscience1,
    Please post your Product number (please don’t post your serial number). Here is a link to help you find it:
    http://goo.gl/tLO9L

  • How can we do this? (Can it be done using ASSIGN Component stuff)

    Hi,
    I am trying to validate the data entered by the user. This is related to tax validation.
    Before the tax validation is carried out, the system checks whether the data entered by the user is valid.
    After the user enters values for the fields LAND1, BUKRS, KOART, STCD1, STCD2, the configuration table zconfig, which has the structure LAND1, BUKRS, KOART and FNAME, is checked.
    The business people will populate zconfig with data for LAND1, BUKRS, KOART and FNAME. FNAME will hold the value STCD1 and/or STCD2.
    Now the user enters CANADA, 567, D, 3456 for the fields LAND1, BUKRS, KOART and STCD1 (this leaves the STCD2 value blank).
    In the config table, for the combination CANADA, 567, D, the value under FNAME is STCD1.
    Hence the data entered by the user is valid.
    EXIT.
    But if the user enters CANADA, 567, D, 3456 for the fields LAND1, BUKRS, KOART and STCD2 respectively (here STCD1 is blank), then when this data is checked against the config table it should be invalidated, because the config table has STCD1 as the value of FNAME for the combination CANADA, 567, D.
    The data entered by the user is invalid.
    Raise an error.
    Can someone tell me how to do this?
    How can we do this using field symbols?
    Thanks..

    Hi,
    just a try:
    Assumption: STCD1 or STCD2 is obligatory!
    FIELD-SYMBOLS: <f> TYPE any.

    ASSIGN (zconfig-fname) TO <f>.             "so <f> points to stcd1 or stcd2
    IF sy-subrc <> 0.
      MESSAGE e001(00) WITH 'false tax no.'.   "field named in FNAME not found
    ELSEIF <f> IS INITIAL.                     "the required tax number is blank
      MESSAGE e001(00) WITH 'false tax no.'.
    ELSE.
      EXIT.
    ENDIF.
    hope it helps
    Andreas

  • How to turn off "Automatic Start/Stop Detection" for HDV Capture

    Hello,
    I'm pulling my hair out trying to capture HDV 1080i-50 without getting separate clips created by the metadata at the start and stop of each shot. It's the feature called "Automatic Start/Stop Detection." I'm not talking about "Make new clip on timecode break."
    How can I turn this Start/Stop thing off? I don't want all these **** clips every time the cameraman hits the record button.
    Anyone know how to turn it off so it stops making separate clips?
    Thanks!
    Kirk

    Yeah, Andy. It helps to have the manual!! THANKS. It says... select or deselect the "Create new clip on Start/Stop" checkbox in the Clip Settings tab of the Log and Capture window.
    It's strange that I was looking for that kind of a box in all the various settings but not in the log/capture window.
    Kirk out.

  • How to write a start routine in the transformations?

    Hi Experts,
    I am working on BI 7. I want to write a start routine in the transformations of the 0FIGL_O02 DSO to allow only the G/L accounts with cost center data. There is already a delete statement, please see:
    *DELETE SOURCE_PACKAGE where BAL_FLAG = 'X'. I have commented it out to allow the G/L accounts. Since I have some G/L accounts that don't have cost center data, I have to write ABAP code that allows only the G/L accounts with cost center data.
    So let me know if anyone can help me with how to write the ABAP code in the start routine.
    Thanks
    sekhar

    Hi,
    You can write the lines of code below and try (receive_nr and doc_type are placeholders for your actual field names):
    SORT SOURCE_PACKAGE BY receive_nr doc_type ASCENDING.
    DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING receive_nr doc_type.
    But you should make sure which of the two rows needs to be deleted,
    i.e. in
    F9001;LU;J001;662;
    F9001;LU;J002;662
    you need to decide whether to eliminate the 1st or the 2nd one (depends on your requirement).
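    The question itself, though, is about keeping only G/L account records that carry a cost center. A minimal hedged sketch of such a start routine line, assuming the source structure of the transformation has a cost center component (the name COSTCENTER below is an assumption; use your actual component name):
    * Hedged sketch: keep only records that have a cost center.
    DELETE SOURCE_PACKAGE WHERE costcenter IS INITIAL.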

  • How to write an inverse routine for an info object at field level.

    Hi All,
    Our requirement is as follows:
    We need to populate 0MAT_PLANT in a virtual provider, but query performance is a concern:
    If someone adds a filter on 0MAT_PLANT to the BW query, it takes the system a while to search the APO database for that 18-character material number, because the data in APO is stored differently (40 characters). The point of the inverse routine is to convert the 18-character value to 40 characters for those times when someone selects on 0MAT_PLANT. That way, the field types match and finding the data in APO is much quicker.
    Hence, since we need 0MAT_PLANT on the selection screen of the query, we are trying an inverse routine that converts 18 characters to 40 characters at query level so that performance is quicker.
    0MAT_PLANT is 18 characters long. We map it from ZMATNR in the source, which is 40 characters. Thus, in the field routine we first write code to convert the value from 40 to 18 characters.
    Below is the code, and the place where we now also need to write the inverse routine that converts it from 18 back to 40 characters.
    *****Code to convert from 40 chars to 18******
    *In APO if the material contains all integers values the material will
    * come over as a 40 byte field.  It needs to be shortened.
      IF SOURCE_FIELDS-/BIC/ZMATNR CO '0123456789'.
        short_material = SOURCE_FIELDS-/BIC/ZMATNR+22(18).
    *In APO if the material contains any character fields the material will
    * not be zero filled and will be left aligned. Take the first 18 bytes.
      ELSE.
        short_material = SOURCE_FIELDS-/BIC/ZMATNR(18).
      ENDIF.
    *add leading zeros to numeric only value
      IF short_material CO '0123456789 '.
         w_num18 = short_material.
         short_material = w_num18.
      ENDIF.
      RESULT = short_material.
    *$*$ end of routine -
    *********** here the inverse routine needs to be written ************
    *       Method invert_0MATERIAL
    *       This subroutine needs to be implemented only for direct access
    *       (for better performance) and for the Report/Report Interface
    *       (drill through).
    *       The inverse routine should transform a projection and
    *       a selection for the target to a projection and a selection
    *       for the source, respectively.
    *       If the implementation remains empty all fields are filled and
    *       all values are selected.
      METHOD invert_0MATERIAL.
    *$*$ begin of inverse routine - insert your code only below this line*-*... "insert your code here
    Here you would write logic in 0MAT_PLANT to do the opposite of the above,
    convert 18 back to 40 characters.
    *$*$ end of inverse routine - insert your code only before this line *-*
      The commented text above says that the subroutine, i.e. the inverse routine, is implemented only for direct access (for better performance) and for the report/report interface (drill-through).
    Could anybody please help me or give me any idea related to this?
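    For the value mapping itself, a hedged sketch of the reverse conversion (18 back to 40 characters), mirroring the forward code above, might look like this. The variable names are illustrative, and the generated interface of the inverse method (the field and selection transformation parameters) is not reproduced here:
    * Hedged sketch of the reverse value mapping only (18 -> 40 chars).
      DATA: long_material(40) TYPE c,
            w_num40(40)       TYPE n.

      IF short_material CO '0123456789 '.
    *   Numeric material: re-pad with leading zeros up to 40 digits.
        w_num40 = short_material.
        long_material = w_num40.
      ELSE.
    *   Alphanumeric material: keep it left-aligned, no padding.
        long_material = short_material.
      ENDIF.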

  • When starting Final Cut Pro 7.0.3 I am all of a sudden getting a message that says "One or more of the scratch disks don't have read/ write access" and now the app won't operate - how do I fix this so I can use Final Cut Pro?

    Glad you found the answer.  But something seems wrong.  FCP should be able to assign the scratch disk to your startup drive.  It's not advisable, but it should be possible.  You might want to try and figure out what's going on before whatever's going on causes other problems.

  • How to write a Global Routine for Transfer Rules

    Hi Experts,
    How do I write a global routine for transfer rules, where all the InfoObjects need to be grouped? The effect needs to be shown on a group of fields that are available in a single structure.
    Thanks in Advance
    Vara

    Hi,
    Are you aware of the start routine that is available? It helps you write a routine over all the InfoObjects available in the transfer structure; a tiny sketch follows after the link below.
    http://help.sap.com/saphelp_nw04/helpdata/en/b3/0ef3e8396111d5b2e80050da4c74dc/frameset.htm
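    As a small hedged illustration only: in a 3.x transfer-rule start routine the whole data package is available as one internal table (DATAPAK in the generated routine), so a rule over a group of fields can be written in one place. The field /BIC/ZSTATUS below is a placeholder for any field of your transfer structure:
    * Hedged sketch of a transfer-rule start routine body.
    DELETE DATAPAK WHERE /bic/zstatus = 'X'.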
    regards
    Happy Tony
