Can the case be changed while uploading the data or after uploading?

Hi all,
Can you please help me? Can the case of the data in an internal table be changed while the program is running? The data is uploaded into an internal table, and then in a loop over that itab the conditions are evaluated to produce the result. The problem is that when the data is entered in lowercase, the last (fallback) condition executes even though the data satisfies an earlier condition, which should not happen. This is a case-sensitivity problem. Can you please help me? Can the case be changed while uploading the data or after uploading?

This is the itab declaration:
data: begin of it_input occurs 0,
        tra like tstc-tcode,
      end of it_input.
From the uploaded data, the program should then check whether each transaction code has any user exits.
Here is the code:
sort it_input by tra.
delete adjacent duplicates from it_input.

loop at it_input.
  it_itab-sno = sy-tabix.

* Check whether the transaction code exists.
  select single * from tstc where tcode eq it_input-tra.
  if sy-subrc eq 0.

*   Determine the development class of the transaction's program.
    select single devclass from tadir into v_devclass
           where pgmid    = 'R3TR'
             and object   = 'PROG'
             and obj_name = tstc-pgmna.
    if sy-subrc ne 0.
*     No TADIR entry as a program: if it is a function module,
*     take the development class of its function group instead.
      select single * from trdir where name = tstc-pgmna.
      if trdir-subc eq 'F'.
        select single * from tfdir where pname = tstc-pgmna.
        select single * from enlfdir where funcname = tfdir-funcname.
        select single * from tadir where pgmid    = 'R3TR'
                                     and object   = 'FUGR'
                                     and obj_name eq enlfdir-area.
        move tadir-devclass to v_devclass.
      endif.
    endif.

*   Look for SMOD enhancements (user exits) in that development class.
    select * from tadir into table jtab
             where pgmid    = 'R3TR'
               and object   = 'SMOD'
               and devclass = v_devclass.
    if sy-subrc = 0.
*     Read the transaction text.
      select single * from tstct where sprsl eq sy-langu
                                    and tcode eq it_input-tra.
      if not jtab[] is initial.
        loop at jtab.
          select single modtext from modsapt into str
                 where sprsl = sy-langu
                   and name  = jtab-obj_name.
          it_itab-tra        = it_input-tra.
          it_itab-i_obj_name = jtab-obj_name.
          it_itab-i_modtext  = str.
          append it_itab.
          clear str.
        endloop.
      endif.
    else.
      it_itab-tra        = it_input-tra.
      it_itab-i_obj_name = ' '.
      it_itab-i_modtext  = 'No user Exit exists'.
      append it_itab.
    endif.
  else.
    it_itab-tra        = it_input-tra.
    it_itab-i_obj_name = ' '.
    it_itab-i_modtext  = 'Transaction Code Does Not Exist'.
    append it_itab.
  endif.
endloop.
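
One way to fix the case-sensitivity issue described above (a minimal sketch, assuming the it_input declaration shown earlier and that the values are compared against TSTC-TCODE, which is stored in upper case) is to convert the uploaded values to upper case right after the upload, before the SORT and the main LOOP. The standard TRANSLATE statement changes the case of the field in place:

* Convert every uploaded transaction code to upper case so that the
* later comparisons against TSTC-TCODE are no longer case sensitive.
loop at it_input.
  translate it_input-tra to upper case.
  modify it_input.
endloop.

sort it_input by tra.
delete adjacent duplicates from it_input.

The same TRANSLATE could instead be applied to each record in the routine that fills it_input (for example, right after the upload function module returns), so the internal table never contains lowercase entries in the first place.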

Similar Messages

  • How to change the Data sources after deploying the application ??

    Hi All,
    I want to know how to change the data sources after deploying the application to the application server.
    I'm using Oracle Application Server 10g Release 3 (10.1.3.1.0).

    Can you access the Enterprise Manager website of the target Application Server from your location? If so, you can change the datasource there. If not, you can bundle the datasource definition in your archive and use that one instead of the one configured in the target OC4J container. Or this could simply be the responsibility of your customer: whenever you send a new WAR file, they modify the datasource if needed and deploy the application.

  • Can the Date of the messages in Mail.app all appear as "date_sent"?

    I have an IMAP mail account, and it has been added to Mail.app on my Mac (OS X Yosemite 10.10.2).
    As I understand it, a mail message has three "time stamps": date_sent, date_received, and date_internal.
    The first two (date_sent and date_received) are stored in the source of the mail message,
    and the last one (date_internal) is stored on the mail server.
    date_internal is the time when the mail message was first received on the server.
    Normally, date_received and date_internal are the same or very close to each other.
    However, when old mail messages are synced back to the mail server, these two time stamps can be quite different.
    For example, I accidentally deleted some mail messages on the server, and I have backup files on my Mac, so I can sync these messages back to the mail server through the IMAP protocol.
    In that situation, date_received is still the original received time because it is always stored in the source of the mail message, while date_internal becomes the time when I synced the mail message.
    Unfortunately, Mail.app uses date_internal as the default Date shown in the list, not date_received or date_sent, and no settings for this can be found.
    Can Mail.app show date_received instead of date_internal in my mail list? Thank you so much!
    By the way, I have tried AppleScript to change the dateReceived element, but the system said the operation is not allowed.

    Maybe I'm missing something...?
    In Mail.app -> View -> Columns you can add various columns to the view.
    Is that not what you want?

  • How long can the data in PSA be kept?

    Hello all,
    If a PSA is used for data staging, how long can/should the data be kept in it in a real-time scenario?
    Your help will be very much appreciated,
    points will be awarded too.
    Thanks
    S N

    Hi,
    It always depends on the frequency of the data loads and the required retention of the data in the system. Ideally we keep only two days of data in the PSA for full loads, and keep all the delta loads in the PSA, including the initialization request, so that if any problem occurs we can reload the data from the PSA.
    But before all that you need to discuss space/tablespace with your Basis team, and decide accordingly how many requests to keep in the system.
    Assign points if it helps.
    Regards,
    Vijay

  • Can the Date be Formatted in BPC? - Version: BPC Release 7.5.112.07/MS

    Hi BPC Community,
    We want to format a dimension property in BPC 7.5 to display the date as: Day-Month-Year, e.g., 8-Aug-14. In our existing database we have different formats such as: 01-Oct-2002 or 12/31/2014.
    Is there a way to set this property to display dates as: 01-Oct-2002 no matter what the actual input is?
    Thanks. Barry

    Hi Barry,
    Not sure I've understood, but if the data entry sheet is full of EVSND functions instead of one or a few EVDREs, that could be the cause; it would be better to convert the sheets to EVDRE.
    "I don't know what 'modify application with process application enabled' means. If you can provide the steps I can try this."
    Go to the admin console, select the application and click Modify Application (in the action pane), tick the Process Application checkbox and run it. See whether there are errors; if it is OK, retry opening the data entry.
    Regards
         Roberto

  • Data lost after 10 failed login attempts...Can the data be restored?

    Hello...
    Does anybody have a solution to recover the data after 10 failed login attempts?
    My son played with my iPhone... and the data is gone...
    Thanks

    Yes, I have already followed this article.
    After changes in the AD account, we need to follow this article.
    But my concern here is that after the password change prompt, it keeps retrying again and again with the new password. It will cross the threshold limit, which is set to 10, but the account doesn't get locked.
    Thanks for the suggestion.
    Regards

  • How to synchronize with the database after pre-loading the data

    Hi,
    I have pre-loaded the data from the database table into the cache.
    If a key is not found in the cache, I want it to connect to the database and get the value from the table. How can I achieve this?

    Hi JK,
    I have pasted my cache loader code, config file and the main class in the other post, but I'm not sure what the issue with it is. It's not working. Can you please tell me what might be wrong with that piece of code? I'm not getting any exception either, but the load() or loadAll() method is not getting triggered at all when cache.get() or cache.getAll() is invoked. What might be the cause of this issue?
    Can you give me the coherence-cache-config.xml contents?
    I'm not sure whether it is an issue with the config file, because I have read somewhere that a refresh-ahead factor is required to trigger the loadAll() method.
    Edited by: 943300 on Jul 4, 2012 9:57 AM

  • What will happen to the data block after data offloading?

    Hi All,
    Since offloading can filter out unnecessary information, such as unneeded columns and rows, before the data is passed to the database, I want to know what happens to the data block. Are new blocks built at this point that contain only the useful information?
    Best regards,
    Leon

    Andy Klock wrote:
    "The statement seems to imply that Exadata has the ability to strip out columns from a block, but a block is a block is a block. Offloading is remarkable at filtering out data in the storage layer that ultimately is not needed, but if you only need one row in a block that has 100 rows in it, you still get the whole block (and all 100 rows) to be processed by Oracle. The columns portion of the statement is when HCC is used for a table, allowing only the blocks containing the column data required for the query; thus if a block has 1000 column values in it, it will pass all 1000 column values to the instance to be processed."
    These assertions are incorrect.
    Blocks sent to the database grid as a result of a Smart Scan contain only the necessary columns and rows for the database grid to do its processing (after filter and projection restrictions are applied). These Smart Scan blocks are created at run time by the storage server, so they have no bearing on the blocks that reside physically on disk, which is why they cannot be reused by other queries via the SGA and are read directly into PGA space.
    Regards,
    Greg Rahn | blog | twitter | linkedin

  • How does the data get inserted after a delete command

    Hi Folks,
    I have a question regarding a scenario. I deleted some data from a table (I used a WHERE clause to delete a portion of the rows). After the rows are deleted, the space is free but below the high water mark. When I insert new rows into the same table, where does Oracle get the space from?
    A. Will it use the space that was freed up by deleting the rows?
    B. Will it use the space above the high water mark? (If it uses this space, let's say all the extents are full and we don't have any extents above the high water mark; will it then use the space freed by the deleted rows, or will it ask for additional space?)
    Please advise.
    I would also appreciate a link to documentation for the above scenario.
    Thanks in advance
    Karthik

    A. Will it use the space that was freed up by deleting the rows?
    That depends; it is controlled by the PCTUSED and PCTFREE parameters, as described below. The free space also needs to be big enough to hold the new row.
    The PCTUSED Parameter
    The PCTUSED parameter sets the minimum percentage of a block that can be used for row data plus overhead before new rows are added to the block. After a data block is filled to the limit determined by PCTFREE, Oracle considers the block unavailable for the insertion of new rows until the percentage of that block falls beneath the PCTUSED parameter. Until this value is reached, Oracle uses the free space of the data block only for updates to rows already contained in the data block.
    B. Will it use the space above the high water mark? (If it uses this space, let's say all the extents are full and we don't have any extents above the high water mark; will it then use the space freed by the deleted rows, or will it ask for additional space?)
    When Extents Are Allocated
    When you create a table, Oracle allocates to the table's data segment an initial extent of a specified number of data blocks. Although no rows have been inserted yet, the Oracle data blocks that correspond to the initial extent are reserved for that table's rows.
    If the data blocks of a segment's initial extent become full and more space is required to hold new data, Oracle automatically allocates an incremental extent for that segment. An incremental extent is a subsequent extent of the same or greater size than the previously allocated extent in that segment.
    Please advise. I would also appreciate a link to documentation for the above scenario.
    Check the Oracle Concepts guide for more detail: Data Blocks, Extents, and Segments.

  • Dsc why the data lost after two days?

    Hi,
    I have found a problem with the DSC module. As you can see in the attached picture, there are two lines: one is white, the other is blue. The white one is missing in some places, but the blue one is continuous.
    As far as I know, the computer crashed on August 25, but the data loss occurred on August 23, which is a long time before the crash. Can anybody tell me why this happened?
    Attachments:
    untitled.png 3915 KB

    What is the data? What is displaying the data (graph or chart)? What does your code look like? Why did you post a 4M image to the forum?
    NaN in the data will appear as gaps in plots.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • How to troubleshoot why data is not moving into the Data Warehouse after a SQL Server Agent job run

    Hello,
    Here is my problem:
    Data was imported into the staging area. After resolving some errors and running the job, I got the data to move over to the next area. From there, the data should be moving into the DW. I have been troubleshooting for hours and cannot resolve this issue. I have restarted the SQL Server services, I have run a couple of packages manually, and the job is running successfully.
    What are some reasons why data is not getting into the data warehouse? Where should I be looking?
    Your help is greatly appreciated!

    Anything is possible.
    So, just to reiterate: running the job manually works, while the scheduled job produces no errors but no data arrives in the DW either, right? And it used to work, correct?
    If so, the first step would be to examine the configuration(s). But not before you inspect the package. Are you able to export it to the file system and open it in BIDS?
    Arthur My Blog

  • Sort the data set after querying

    Hi
    I have a grid where I query and load the data.
    Is it possible to filter the data set that I have loaded into the grid with a check box item and show only the filtered rows?
    In some cases the input for that filter needs to come from the first row of the grid.
    (Dev Forms Builder (9.0.4.2.0))
    rgds
    shabar

    Hi Navnit Punj/Danish,
    I have the following table:
    col1  col2  col3  col4
    12    po    56    Y
    13    ss    75    Y
    14    ty    40    N
    19    po    35    Y
    21    po    80    N
    With a normal query (F11, enter the col2 value in the grid, then Ctrl+F11) I get the following data in the grid:
    12    po    56    Y
    19    po    35    Y
    21    po    80    N
    With this, by clicking the check box I want to get only the records that have 'Y' in col4.
    rgds
    shabar

  • How to undelete the data deleted after reverting to single disk

    There was just one drive in the Media Hub NMH405, so I pressed "revert to single disk" without backing up the data first, and lost all the data!
    Is there any option to restore the data?

    Hi zzitmanis,
    Too bad to hear that.
    Unfortunately, once you revert the disk to single disk mode, it will delete all the files in the drive because it will definitely reformat the drive. 

  • Can the Data Foundation Of Business View Manager Handle 100 tables or views

    Post Author: palm
    CA Forum: Crystal Reports
    Hi All,
    I am working on a Data Foundation (DF) which will take 100 Oracle views; some of them may be tables.
    Does anyone know the limit on the number of tables or views that can be included in a DF, or can a DF handle any number of tables or views joined in any way?
    One more thing: I had around 30 tables in my DF. When I double-click a table or view that has been inserted into the DF, it expands and collapses.
    But I face this problem: when I expand a table by double-clicking it in the DF and then save the DF, the BVM window closes.
    Does anyone know about this? Please help me.
    Thanks!

    Post your question to the BV forums: Semantic Layer

  • How can the first sync after launching Contribute be sped up?

    So a user brings up Contribute and it hangs there for 2 or 3 minutes syncing. Is there any way to speed this process up?

    There has never been a FaceTime app on the iPhone.
    As the manual says, you access it from Contacts: pick a contact and scroll down to the FaceTime button.

Maybe you are looking for

  • iTunes keeps trying to download an HD movie I don't want.

    My broadband provider (Qwest) has an issue with large files.  I'm not able to download any single file that is larger than 2 GB, I've called them multiple times on this issue and they say yes, it's real and it's a problem.  Any time I try and downloa

  • Iaik.security.ssl.SSLCertificateException - the mother of all errors

    Hi, We're experiencing this error: Error occurred while connecting to the FTP server "whatever:whichever": iaik.security.ssl.SSLCertificateException: Peer certificate rejected by ChainVerifier when connecting to the FTPS server. What was done by the

  • How to authenticate using Active directory!

    Hi all! at present im using a code given below, its working fine! currently we are using mixed mode active directory! we are going to migrate that to Native mode! import java.util.Properties; import javax.naming.*; import javax.naming.directory.*; im

  • The extraction program does not support object ZTD_AW_17

    Hi friends, When doing extraction I am getting the below error.Please see the below extract of system response. "The extraction program does not support object ZTD_AW_17 Message no. R3 009 Diagnosis The application program for the extraction of data

  • Unnecessary and incorrect thumb rebuild CS5 BR5

    I installed Xrite's colorchecker passport system and created a DNG test profile.  I created exactly one image in a test directory and applied the generated profile to that image.  This was successful. Subsequently, I opened two other directories with