Transformation logic for char and keyfigure from source keyfigures and flag

Hi All,
   My requirement is to populate characteristic and key figure values from source key figures and a flag; in effect, a transformation that converts a key-figure-based structure into an account-based structure. I am loading data from cube1 to cube2.
cube1 structure with sample data:
Plant     Furnace   ZFurnace   ZPlant1   ZPlant2   Flag
P01       (blank)   0          56        73        P
(blank)   F01       335        0         0         F
Target Cube str with sample data
Plant     Furnace   FS (Char)   KYF   Flag
P01       (blank)   1           56    P
(blank)   F01       2           335   F
P01       (blank)   3           73    P
ZPlant1, ZPlant2 and ZFurnace are the keyfigure tech.names.
FS has master data:
FScode   KYF        Flag
1        ZPlant1    P
2        ZFurnace   F
3        Zplant2    P
While loading data from source cube1, I need to read the FS master data and then fill the FS code in the target cube based on the source flag and the key figure technical names.
I would greatly appreciate it (with points) if anyone can help me write the ABAP logic. The challenging part for me is comparing the key figure technical names in cube1 with the key figure values in the FS master data.
Thanks.
Baba.

Hi All,
   Actually, there will be 18 records in the FS master data and no more than 18 key figures. Is there any way I can hard-code the values and write a small piece of code?
something like:
* kyf holds the key figure name read from the FS master data record
IF kyf = 'ZPLANT1'.
  wa_fscode  = '1'.
  wa_eu      = source_fields-/bic/zplant1.
  wa_plant   = source_fields-plant.
  wa_furnace = source_fields-furnace.
ELSEIF kyf = 'ZPLANT2'.
  wa_fscode  = '3'.
  wa_eu      = source_fields-/bic/zplant2.
ENDIF.
Please let me know the sample logic with code. I would greatly appreciate it (with points).
Regards
Baba
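
A minimal sketch of one way this could be coded (not taken from an actual answer in this thread): keep the FS master data in a small internal table and read the value of the matching key figure from the source record with ASSIGN COMPONENT, which covers the "comparing key figure technical names" part. The structure and field names used here (source_fields, result, result_package, PLANT, FURNACE, FS, KYF, FLAG and the /BIC/ key figure fields) are assumptions and have to be replaced with the real field names of the cubes.

TYPES: BEGIN OF ty_fs,
         fscode TYPE char2,
         kyf    TYPE fieldname,   " technical name of the source key figure
         flag   TYPE char1,
       END OF ty_fs.
DATA: lt_fs TYPE STANDARD TABLE OF ty_fs,
      ls_fs TYPE ty_fs.
FIELD-SYMBOLS: <lv_kyf> TYPE ANY.

" FS master data, hard-coded as in the example (at most 18 entries);
" alternatively SELECT it from the FS master data table into lt_fs.
ls_fs-fscode = '1'. ls_fs-kyf = '/BIC/ZPLANT1'.  ls_fs-flag = 'P'. APPEND ls_fs TO lt_fs.
ls_fs-fscode = '2'. ls_fs-kyf = '/BIC/ZFURNACE'. ls_fs-flag = 'F'. APPEND ls_fs TO lt_fs.
ls_fs-fscode = '3'. ls_fs-kyf = '/BIC/ZPLANT2'.  ls_fs-flag = 'P'. APPEND ls_fs TO lt_fs.

" One source record can produce several target records: one per matching FS entry.
LOOP AT lt_fs INTO ls_fs WHERE flag = source_fields-flag.
  " Read the key figure of the source record by its technical name.
  ASSIGN COMPONENT ls_fs-kyf OF STRUCTURE source_fields TO <lv_kyf>.
  IF sy-subrc <> 0.
    CONTINUE.
  ENDIF.
  IF <lv_kyf> IS INITIAL.          " skip zero key figures, as in the sample data
    CONTINUE.
  ENDIF.
  CLEAR result.
  result-plant   = source_fields-plant.
  result-furnace = source_fields-furnace.
  result-fs      = ls_fs-fscode.
  result-kyf     = <lv_kyf>.
  result-flag    = source_fields-flag.
  APPEND result TO result_package.
ENDLOOP.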

Similar Messages

  • The description for Event ID 0 from source VSTTExecution cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.


    Hi,
    Can anyone help me with the issue below?
    While executing an automated test case in Test Center, the test case status shows as "in progress". I navigated to the virtual machine (where the test agent is online); there Internet Explorer opened automatically and my test case execution started, but within seconds IE closed. However, in Test Center the test case status still shows as in progress, and after 2 or 3 minutes some test cases pass and some fail.
    My question: in Lab Center --> virtual machine, IE opened automatically, navigated 2 or 3 links, and closed within seconds. Why is IE closing without completing execution?
    Error Message :
    The description for Event ID 0
    from source VSTTExecution cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
    If the event originated on another computer, the display information had to be saved with the event.
    The following information was included with the event: 
    (mtm.exe, PID 5240, Thread 4) FileAggregatorSessionInfo: Error occurred while deleting temporary directoryException: System.IO.DirectoryNotFoundException:
    Could not find a part of the path 
       at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
       at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
       at Microsoft.VisualStudio.TestTools.Execution.Aggregation.FileAggregator.FileAggregatorSessionInfo.DeleteTemporaryDirectory(String
    temporaryDirectory)
    the message resource is present but the message is not found in the string/message table
    We are using VS 2013 (Premium) and IE 10.0, and the scripts were created on the same versions. Everything works fine from the developer command prompt, but the same tests fail on the Lab Center test agent machine.
    test agent log:
    QTAgentService.exe, AgentService: calling AgentObject.RunEndFileCopyComplete
    QTAgentService.exe, AgentProcessManager.WaitForDataCollectionAgentProcessToStart: waiting for agents to start.
    QTAgentService.exe, AgentProcessManager.WaitForDataCollectionAgentProcessToStart: Agents started.
    QTAgentService.exe, AgentProcessManager.PerformActionIgnoringExceptions: Successfully called 'Cleanup' on the test agent
    QTAgentService.exe, AgentProcessManager.PerformActionIgnoringExceptions: Calling 'StopDataCollection(int)' on the data collection agent
    QTAgentService.exe, AgentProcessManager.IsDataCollectionAgentNeeded: IsExecutedOutOfProc? True
    QTAgentService.exe, AgentPro
    QTAgentService.exe, AgentProcessManager.WaitForDataCollectionAgentProcessToStart: Agents started.
    QTAgentService.exe, AgentService: Connection to controller is up.
    Thanks
    Suresh

  • The description for Event ID 8306 from source Microsoft-SharePoint Products-SharePoint Foundation cannot be found

    hi,
    can anyone please help me with the following:
    The description for Event ID 8306 from source Microsoft-SharePoint Products-SharePoint Foundation cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair
    the component on the local computer.
    If the event originated on another computer, the display information had to be saved with the event.
    The following information was included with the event:
    The HTTP service located at http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc/actas is too busy.
    The publisher has been disabled and its resource is not available. This usually occurs when the publisher is in the process of being uninstalled or upgraded.
    Fahad Khan

    Hi,
    Please try the following steps to troubleshoot your issue:
    1. Re-run the SharePoint 2010 Products Configuration Wizard to see whether any problems still exist.
    2. Go to IIS and check the status of the SecurityTokenServiceApplicationPool application pool; if it is stopped, restart it.
    3. Go to manage web application services, review the status of the Security Token Service application, and try to restart it.
    4. In Central Administration > Security > Configure Service Account, change the service account for the Security Token Service application to some other managed account.
    5. If the issue persists, try the resolution in this blog:
    http://blogs.msdn.com/b/sowmyancs/archive/2010/07/16/sharepoint-2010-service-applications-bcs-metadata-access-service-are-not-working.aspx
    Let me know the result.
    Xue-Mei Chang

  • I am considering buying a new Mac laptop to run LOGIC for composition and band live/recording, but which one is best as I do not want to spend too much money? Does it have a line in and how do you monitor sound? Will I need adaptors and an interface?

    Can anybody help?
    I am considering buying a new MAC laptop to run LOGIC for composition and band live/recording, but which one is best as I do not want to spend too much money?
    Does it have a line in and how do you monitor sound?
    Will I need adaptors and an interface?
    Also, since Logic only runs on Macs, I am guessing I would not need the best-spec machine to run it?
    I see all the upgrades as additional memory or a faster processor?
    Is a Retina screen necessary, and why flash-based storage against a 1TB hard drive, and an i5 instead of an i7?
    The main reason for this purchase is to play live and use backing tracks and record found sounds and make creative songs.
    I hope you can provide some valuable feedback, as I am a longtime Mac user and see upgrades and changes happen regularly, but the most important thing is the songs, not the equipment.
    I have £500 already and willing to add another 500 to 700 pounds, then software extra.


  • Can one move or upload a garageband project done on an iPad into logic for further and finer tweaking?

    Can one move or upload a garageband project done on an iPad into logic for further and finer tweaking?

    Share your GarageBand iPad project to iTunes as a project. Then you can import it into GarageBand and upgrade it to a GarageBand Mac project. And Logic can open GarageBand for Mac projects.

  • Two Extractor logics for full and delta?

    Hi,
    I have a situation where the logic for full upload and the logic for delta upload are drastically different.
    What is the best way to deal with it?
    Should I have two different extractors for full and delta?
    Raj

    Hi Raj,
    I assume you are working with generic extractors. If the posting volume is too high, you definitely need to restrict the data on the R/3 side for performance reasons, and you need to maintain the delta functionality for it.
    If you cannot maintain the delta from the R/3 side, then at least try to restrict it from the BW side at the InfoPackage level: extract the data based on "changed on" and "created on" with two different InfoPackages and update them to the ODS (here make sure all the objects are set to overwrite); then you can maintain the delta load from the ODS to the cube easily.
    If the volume of the data is fairly small, you can use a full load to the ODS and then a delta to the cube.
    Thanks
    Hope this helps.

  • Exchange rate translation logic for FI and CO

    Dear FI gurus
    I have question about exchange rate setting.
    - Controlling area currency: USD
    - Object currency/Company code currency: JPY
    - Transaction currency : JPY
    - Inverted exchange rate indicator : deactivated.
    (So, exchange rate is maintained by each country's subsidiary)
    In this case, according to SAP's help,
    - Controlling area currency: Converted from Transaction currency (EUR) to Controlling area currency (USD)
    - Object currency: Converted from Controlling area currency(USD) to Transaction currency(JPY)
    This means that if transaction currency is JPY100, object currency is NOT JPY100, right?
    (Because cross rate reference is deactivated and JPY->USD and USD->JPY is different rate)
    On the other hand, in FI(OB22), I setup Group Currency which is USD.
    In FI, translation logic can be selected whether "From transaction currency" or "From first local currency".
    But still different logic from CO side.
    So it seems GL and the controlling area have different translation logic.
    How do global companies normally manage this?
    Yoshi

    Dear Yoshi-san,
    For a global company, e.g. COMPANY ABC (head office in the USA) has different legal entities in different countries (Japan, Singapore) where the local currencies are different. In my opinion below set up is normal.
    Controlling area ABC
    Assuming all companies (ABC, DEF, GHI) are assigned to controlling area ABC.
    Controlling area currency (for controlling area ABC): USD
    1. COMPANY ABC (head office)
       Company code currency: USD
       Object currency: USD
    2. COMPANY DEF (Japan entity)
       Company code currency: JPY
       Object currency: JPY
    3. COMPANY GHI (Singapore entity)
       Company code currency: SGD
       Object currency: SGD
    Let's say for the Japan entity a document of JPY 100 (transaction or document currency) has been posted. The amount in company code currency is JPY 100; the amount in object currency will be JPY 100; the amount in controlling area currency will be USD 0.99 (exchange rate table: 100 JPY = 0.99 USD).
    If another document of EUR 100 (transaction or document currency) has been posted, then the amount in company code currency is JPY 13,800 (exchange rate table: 1 EUR = 138 JPY); the amount in object currency will be JPY 13,800; the amount in controlling area currency will be USD 136 (exchange rate table: 1 EUR = 1.36 USD).
    As for a third document of USD 100 (transaction or document currency), the amount in company code currency is JPY 10,100 (exchange rate table: 1 USD = 101 JPY); the amount in object currency will be JPY 10,100; the amount in controlling area currency will also be USD 100, so no translation is required.
    Kind regards,
    John Chin

  • Transformation logic for a scenario

    Hi Experts,
    Our scenario is:
    In DSO1, against doc A, documents such as B, C and D will be created.
    Of these (B, C, D) only one, say C, will have a follow-up document E created against it (E has to be looked up in DSO2 via the common document).
    A -> B
    A -> C -> E
    A -> D
    Requirement:
    If any of the three (B, C or D) has a follow-up document (E), then populate a constant, say 'W', to InfoObject 'Status' against all three, i.e. B, C and D.
    There is no rule that A, B, C, D, E are created on the same day.
    Can anyone please let me know how to achieve this through a transformation from DSO1, where all of B, C, D will be there, while E has to be looked up in the other DSO, DSO2.
    Thanks & Regards,
    Bhadri M.

    Hi,
    First let me mention what I actually understood from your example. Your DSO1 has fields as below
    Document1   Document2   Status
    A           B
    A           C
    A           D
    Similarly DSO2 has fields
    Document1   Document2
    C           E
    What you want in DSO1 is below
    Document1   Document2   Status
    A           B           W
    A           C           W
    A           D           W
    If this is the case, it can be achieved but would be a bit complex.
    Hopefully the datasource that fetches the data to DSO1 is a Z datasource. Change your datasource logic in such a way that, if a follow-up document B is created for a document A on a given day, it fetches the records for all the follow-up documents that have been created for document A to date, i.e. A->B, A->C, A->D. This way the data coming to PSA contains all the documents that have been created for A so far. Next, in the start routine, write a routine along the following lines (a rough sketch is shown after this post):
    Select from DSO2, for all entries in source_package, where document2 of DSO1 = document1 of DSO2, into INT1.
    Then loop over source_package. Read INT1 for the documents that appear in source_package as follow-up documents (i.e. check whether C exists in INT1). Once you find that a follow-up document exists in INT1, copy the main document from source_package into another internal table INT2 (copy A into it). This way the loop collects all the main documents that have follow-up documents in DSO2.
    After the loop ends, loop over source_package again.
    Now read whether document1 from source_package (A) exists in INT2 or not. If yes, update the status as 'W'.
    That's it.
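    A rough start-routine sketch of the lookup described above. This is only a sketch: the DSO2 active-table name /BIC/AZDSO200 and the field names DOC1, DOC2 and STATUS in SOURCE_PACKAGE are assumptions and have to be replaced with the real table and InfoObject field names of your DSOs.
    TYPES: BEGIN OF ty_dso2,
             doc1 TYPE char10,
             doc2 TYPE char10,
           END OF ty_dso2.
    DATA: lt_dso2 TYPE STANDARD TABLE OF ty_dso2,
          lt_main TYPE STANDARD TABLE OF char10.
    FIELD-SYMBOLS: <ls_src> LIKE LINE OF source_package.
    " 1. Fetch the DSO2 records whose predecessor is one of the follow-up documents (B/C/D).
    IF source_package IS NOT INITIAL.
      SELECT doc1 doc2 FROM /bic/azdso200 INTO TABLE lt_dso2
        FOR ALL ENTRIES IN source_package
        WHERE doc1 = source_package-doc2.
    ENDIF.
    " 2. Collect the main documents (A) that have at least one follow-up document in DSO2.
    LOOP AT source_package ASSIGNING <ls_src>.
      READ TABLE lt_dso2 TRANSPORTING NO FIELDS WITH KEY doc1 = <ls_src>-doc2.
      IF sy-subrc = 0.
        APPEND <ls_src>-doc1 TO lt_main.
      ENDIF.
    ENDLOOP.
    SORT lt_main.
    DELETE ADJACENT DUPLICATES FROM lt_main.
    " 3. Flag every record whose main document is in that list.
    LOOP AT source_package ASSIGNING <ls_src>.
      READ TABLE lt_main TRANSPORTING NO FIELDS
           WITH KEY table_line = <ls_src>-doc1 BINARY SEARCH.
      IF sy-subrc = 0.
        <ls_src>-status = 'W'.
      ENDIF.
    ENDLOOP.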

  • Logic for retreiving the values from a dynamic internal table

    Hi all,
    I have an issue with the logic to fetch data from a dynamic internal table into fields. For example, sy-tfill = 9 (subject to vary), and I need to populate the fields.
    Regards,
    Sukumar

    Hi,
    this is for printing out the info in a dynamic table; inserting data works along the same lines.
    " Generic display of any transparent table: read its field list from DD03L,
    " create the internal table dynamically, and write every field of every row.
    PARAMETERS:
      p_tabnam TYPE tabname DEFAULT 'DB_TABLE_NAME'.
    DATA:
      lv_dref TYPE REF TO data,
      ls_dd03l LIKE dd03l,
      lt_fieldname TYPE TABLE OF fieldname,
      ls_fieldname TYPE fieldname.
    FIELD-SYMBOLS:
      <fs> TYPE STANDARD TABLE,
      <wa_comp> TYPE fieldname,
      <wa_data> TYPE ANY,
      <wa_field> TYPE ANY.
    REFRESH lt_fieldname.
    " Field names of the table, in their dictionary order
    SELECT * FROM dd03l INTO ls_dd03l
                       WHERE as4local = 'A'
                         AND as4vers  = '0000'
                         AND tabname  = p_tabnam
                       ORDER BY position.
      ls_fieldname = ls_dd03l-fieldname.
      APPEND ls_fieldname TO lt_fieldname.
    ENDSELECT.
    IF NOT ( lt_fieldname[] IS INITIAL ).
      " Create the internal table with the line type of the selected table
      CREATE DATA lv_dref TYPE TABLE OF (p_tabnam).
      ASSIGN lv_dref->* TO <fs>.
      SELECT * FROM (p_tabnam) INTO TABLE <fs>.
      WRITE:
        / 'CONTENTS OF TABLE', p_tabnam.
      LOOP AT <fs> ASSIGNING <wa_data>.
        SKIP.
        WRITE:
          / 'LINE', sy-tabix.
        ULINE.
        LOOP AT lt_fieldname ASSIGNING <wa_comp>.
          " Access the component by its name rather than its position
          ASSIGN COMPONENT <wa_comp> OF STRUCTURE <wa_data> TO <wa_field>.
          IF sy-subrc = 0.
            WRITE:
              / <wa_comp>, <wa_field>.
          ENDIF.
        ENDLOOP.
      ENDLOOP.
    ENDIF.
    grtz
    Koen

  • New iPad or Transformer Infinity for reading and annotating PDFs?

    Hello Everyone,
    I played with the New iPad today in a store and at least in iBooks, the Pdf files load surprisingly slowly from one page to the other (1 second, not THAT slow). I don't yet own a tablet but I want one for annotating PDF files (orchestra scores) when I'm on the road. When I need to be turning pages quickly, I don't want to have to wait forever for the page to sharpen, especially since it will probably be quite heavier with all my notes on the page... Would a Nexus Transformer Prime or the Transformer Infinity be better suited to my needs, i.e. would it be loading faster with annotated PDF files? I understand this is mainly a CPU issue, and Nexus has more powerful ones. Am I right?
    Thanks for your help!

    There are a lot of other PDF readers for the iPad.
    http://www.labnol.org/software/ipad-pdf-reader-apps/13807/
    good reader is one.
    Hands free page turning.
    http://airturn.com/bt-105/
    http://www.youtube.com/watch?v=80OVgx5Q7T4
    Robert

  • Need to apply MTD logic for actual and prior year

    Hi Team,
    We are facing an issue with MTD calculation based on formula variables,
    Issue: MTD Only (Actual and Prior Year) shows the same value for both.
    Requirement:
    When the query is run for a selected month, the query should be run for that month based on the Last TECO Date (Input Variable).
    • For example, if the dashboard is run for January 2014, the MTD calculation should be based on the Last TECO Date, i.e. if I enter an older date/month (not the current date), e.g. 1/20/2014, it should calculate for 1/1/2014 – 1/31/2014.
    • If the dashboard is run for the current month, the MTD calculation should be based on Last TECO Date = 1st day of the month – current day.
    The same calculation logic should apply for MTD prior year: instead of the input variable value, the value for the previous year should be considered.
    Final Calculation:
    If Last Labour Date is blank, 0, else Last TECO Date – Last Labour Date (by individual line item)
    Here we created two formula variables for last teco date and last labour date.
    Also created 2 Customer Exit variable for MTD Actual calculation.
    We have pulled the formula variable and CE variable into the selection and created a new CKF (Teco date).
    We have done the final calculation as ( (labour date == 0)*0 + last TECO date - last labour date ) for MTD Actual.
    MTD Prior Year:
    We have done the same calculation ( (labour date == 0)*0 + last TECO date - last labour date ) for MTD Prior Year by applying an offset of -12 to the value.
    Here the MTD Actual and MTD Prior Year values come out the same; we also tried customer exit variables, and the values still come out the same.
    Please give your valuable inputs/suggestions on how to proceed with this requirement.
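    One option for the prior-year value is a customer-exit variable that shifts the entered Last TECO Date back by one year, so the prior-year restriction no longer depends on the -12 offset. A rough sketch for include ZXRSRU01 (the variable names ZTECODATE and ZTECODATE_PY are made up and have to be replaced with the real variable names):
    DATA: ls_range TYPE rsr_s_rangesid,
          ls_var   TYPE rrrangeexit,
          lv_year  TYPE n LENGTH 4.
    CASE i_vnam.
      WHEN 'ZTECODATE_PY'.            " prior-year copy of the Last TECO Date
        IF i_step = 2.                " runs after the user has filled ZTECODATE
          READ TABLE i_t_var_range INTO ls_var WITH KEY vnam = 'ZTECODATE'.
          IF sy-subrc = 0.
            lv_year = ls_var-low(4) - 1.                            " previous year
            CONCATENATE lv_year ls_var-low+4(4) INTO ls_range-low.  " same month/day
            ls_range-sign = 'I'.
            ls_range-opt  = 'EQ'.
            APPEND ls_range TO e_t_range.
          ENDIF.
        ENDIF.
    ENDCASE.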

    I assume the Sales measure is a monthly measure. The Prior year calculation as I described will give you the sales for the same month in the previous year when selecting month in your report. When selecting year it will add up all months of the prior year until now and you will indeed see the Prior year YTD.
    If you want to see the total sales for the previous year you have to use the same calculation as 'Prior year' and map it to the year level of your time dimension. This will work as a 'partition by year'.

  • Mapping Logic for matching the qualifier from different segments

    Hi all,
    I have the following input file and I would like to get the output shown below.
    I have to look for the STOP from the delivery segment; if it matches the STOP in the E1ADRM4 segment, I have to take that date.
    For the first STOP in the delivery segment I am getting the correct date from the E1ADRM4 segment, but from the second one onwards it is not appearing in the output.
    Please suggest how to achieve this.
    Input file
      <E1EDT20>
          <E1ADRM4>
             <STOP>01</STOP>
             <PARTNER_ID>1108</PARTNER_ID>
             <APNTD>20101115</APNTD>
          </E1ADRM4>
        <E1ADRM4>
             <STOP>10</STOP>
             <PARTNER_ID>1115</PARTNER_ID>
             <APNTD>20101130</APNTD>
          </E1ADRM4>
        </E1EDT20>
       <E1EDL20>
          <VBELN>0085001387</VBELN>
          <ZATEDLVINFO>
             <STOP>10</STOP>
             <PRO>do not update</PRO>
             <CONTAINER>do not update</CONTAINER>
             <VESSEL>do not update</VESSEL>
           </ZATEDLVINFO>
       </E1EDL20>
      <E1EDL20>
          <VBELN>0085009999</VBELN>
          <ZATEDLVINFO>
             <STOP>01</STOP>
             <PRO>do not update</PRO>
             <CONTAINER>do not update</CONTAINER>
             <VESSEL>do not update</VESSEL>
           </ZATEDLVINFO>
       </E1EDL20>
    Output
    <E1EDL20 SEGMENT="1">
                <VBELN>0085001387</VBELN>
                <ZATEDLVINFO SEGMENT="1">
                   <STOP>10</STOP>
                   <PRO>do not update</PRO>
                   <CONTAINER>do not update</CONTAINER>
                   <VESSEL>do not update</VESSEL>
                   <APNTD>20101130</APNTD>
                </ZATEDLVINFO>
             </E1EDL20>
    <E1EDL20 SEGMENT="1">
                <VBELN>0085009999</VBELN>
                <ZATEDLVINFO SEGMENT="1">
                   <STOP>01</STOP>
                   <PRO>do not update</PRO>
                   <CONTAINER>do not update</CONTAINER>
                   <VESSEL>do not update</VESSEL>
                  <APNTD>20101115</APNTD>
                </ZATEDLVINFO>
             </E1EDL20>
    Regards

    Hi,
    Try this UDF:
    Execution type: All values of a context.
    Input: var1, var2, var3.
    int a = var1.length;
    int b = var2.length;
    for (int i = 0; i < a; i++) {
        for (int j = 0; j < b; j++) {
            if (var2[j].equals(var1[i])) {
                result.addValue(var3[j]);
            }
        }
    }
    Mapping (the three inputs feed the UDF, then SplitByValue, then the target APNTD):
    STOP (ZATEDLVINFO), STOP (E1ADRM4), APNTD (E1ADRM4) --> UDF --> SplitByValue --> APNTD
    NOTE: Change the context of all the 3 input fields(set the context to its message type name).
    Thanks
    Amit

  • Need conversion logic for the xml sending from legacy system ...!!!

    Hi Experts ,
    We have a requirement where the legacy (sender) system sends an .xml file and PI needs to pick up the file and send it to SAP ECC (R/3) via the IDOC AAE receiver adapter.
    The problem is that the .xml file which PI receives is in a different format, shown below:
    </tns:Header>
        <tns:Body>
            <esa:Payload>
                <esa:Header>
                    <PayloadName></PayloadName>
                    <PayloadVersion>1.0</PayloadVersion>
                    <PayloadCreated>2014-01-07T02:39:55.793Z</PayloadCreated>
                    <PayloadSize units="Bytes">432</PayloadSize>
                </esa:Header>
                <esa:Data>
                    <zcs:HUM xmlns:zcs="com.">
                        <Hum_Number>00393155965135748871</Hum_Number>
                        <Source_Storage_Location>9000</Source_Storage_Location>
                        <Destination_Storage_Location>0100</Destination_Storage_Location>
                        <Material_Number>000000000000004123</Material_Number>
                        <Batch_Number>321940071 </Batch_Number>
                        <Quantity>0000000096000</Quantity>
                        <Production_Version>A100</Production_Version>
                        <Hostname>POSPI000003</Hostname>
                    </zcs:HUM>
                </esa:Data>
            </esa:Payload>
        </tns:Body>
    </tns:Envelope>
    I need help converting this .xml into a PI-standard XML format (i.e. without the esa/zcs wrappers, so that the message passes through PI without an XML parser or "XML not well-formed" error).
    Do I need to write any Java code for this?
    Experts, I need your suggestions here.
    regards,
    Khan

    Hi Aziz,
    Please make sure your pasted XML has the start and end tags
    <tns:Body> </tns:Body>.
    I don't think you need to change the external definition. Make sure that you use an XSLT mapping first and then the message mapping.
    Regards,
    Muni.

  • Tables for char and attributes

    Hi ,
    How to identify that a characteristic for an inspection lot has long term characteristics ?
    We want to retrieve the characteristic attributes for qualitative characteristics . What is the table link for the qualitative characteristics and the characteristic attributes ?

    Check the following thread:
    Control Indicator field in Inspection plan characteristics
    For tables (results recording):
    QAKL - Results table for value classes
    QAMR - Characteristic results during inspection processing
    Check BAPI:
    BAPI_INSPCHAR_GETRESULT

  • Simple command  for char and number

    How can I ask:
    I want to know if a field contains only numbers, or is alphanumeric...
    e.g. if zdata is alpha, or
    if zdata is alphanumeric...

    DATA a TYPE string VALUE 'abcd'.
    DATA b TYPE string VALUE '12345ab'.
    " CO ('contains only') is true when every character of the field is in the list;
    " the original CA ('contains any') would report '12345ab' as 'only alphabets' too.
    IF a CO 'abcdefghijklmnopqrstuvwxyz'.
      MESSAGE 'only alphabets' TYPE 'I'.
    ELSEIF a CO '0123456789'.
      MESSAGE 'only numbers' TYPE 'I'.
    ELSE.
      MESSAGE 'alphanumeric' TYPE 'I'.
    ENDIF.
