(ABAP-HR) data retrieval

Hi Experts,
How do I retrieve the comments stored in infotype 0019 (Monitoring of Tasks)?
Thanks in advance

Hi,
Infotype 0019 (Monitoring of Tasks) has 3 lines of comments that can be used to store remarks about the tasks assigned to an employee. This text is not stored directly in the database table PA0019; it is kept in the 'PCL1' cluster database under cluster ID 'TX'.
Since the text sits in the PCL1 cluster, the READ_TEXT function module cannot be used to retrieve it. Instead, use one of the two approaches below:
1. Use the IMPORT ptext FROM DATABASE pcl1(tx) statement.
2. Use the RP-IMP-C1-TX macro stored in table TRMAC.
Sample program using the IMPORT statement:
REPORT zhr_read_cluster.

* PCL1 is the HR cluster database read by IMPORT ... FROM DATABASE
TABLES: pcl1.

TYPES:
  BEGIN OF t_pa0019,
    pernr TYPE persno,
    subty TYPE subty,
    objps TYPE objps,
    sprps TYPE sprps,
    endda TYPE endda,
    begda TYPE begda,
    seqnr TYPE seqnr,
    itxex TYPE itxex,
  END OF t_pa0019,
  BEGIN OF t_text,
    line(72),
  END OF t_text.

DATA:
  gt_pa0019 TYPE STANDARD TABLE OF t_pa0019,
  gw_pa0019 TYPE t_pa0019,
  ptext     TYPE STANDARD TABLE OF t_text INITIAL SIZE 10,
  gw_text   TYPE t_text,
  gw_key    TYPE pskey.

SELECTION-SCREEN BEGIN OF BLOCK abc WITH FRAME TITLE text-001.
PARAMETERS:
  p_pernr LIKE pernr-pernr,
  p_date  LIKE sy-datum.
SELECTION-SCREEN END OF BLOCK abc.

START-OF-SELECTION.
  SELECT pernr subty objps sprps endda begda seqnr itxex
    FROM pa0019
    INTO TABLE gt_pa0019
    WHERE pernr EQ p_pernr.

* Only records flagged with ITXEX = 'X' have long text in cluster TX
  LOOP AT gt_pa0019 INTO gw_pa0019 WHERE itxex EQ 'X'.
    MOVE-CORRESPONDING gw_pa0019 TO gw_key.
    gw_key-infty = '0019'.
    REFRESH ptext.
    IMPORT ptext FROM DATABASE pcl1(tx) ID gw_key.
    CHECK sy-subrc EQ 0.
    LOOP AT ptext INTO gw_text.
      WRITE: / p_pernr, gw_text-line.
    ENDLOOP.
  ENDLOOP.
Sample program using the RP-IMP-C1-TX macro:
REPORT ZHUGE_LONGTEXTS.

TABLES: PCL1, PA0000, T582S.

DATA: TX-KEY LIKE PSKEY.                 "cluster key expected by the macro
DATA: BEGIN OF TEXT-VERSION,
        NUMMER TYPE X VALUE '02',
      END OF TEXT-VERSION.
DATA: BEGIN OF PTEXT OCCURS 200.
DATA:   LINE(78).
DATA: END OF PTEXT.
DATA: KEY1 LIKE PCL1-SRTFD.
DATA: KEY2 LIKE PCL1-SRTFD.
DATA: FINAL_KEY LIKE PCL1-SRTFD.
DATA: BLANK(7).                          "pads SUBTY+OBJPS+SPRPS when no subtype is given
DATA: BLANK1(3).                         "pads OBJPS+SPRPS when a subtype is given

PARAMETERS: PERNR LIKE PA0000-PERNR,
            INFTY LIKE T582S-INFTY,
            SUBTY LIKE PA0000-SUBTY,
            ENDDA LIKE PA0000-ENDDA,
            BEGDA LIKE PA0000-BEGDA.

IF SUBTY IS INITIAL.
* key without subtype: PERNR + INFTY, then 7 blanks for SUBTY/OBJPS/SPRPS
  CONCATENATE PERNR INFTY INTO KEY1.
  CONCATENATE ENDDA BEGDA '000' INTO KEY2.
  CONCATENATE KEY1 KEY2 INTO FINAL_KEY SEPARATED BY BLANK.
ELSE.
* key with subtype: PERNR + INFTY + SUBTY, then 3 blanks for OBJPS/SPRPS
  CONCATENATE PERNR INFTY SUBTY INTO KEY1.
  CONCATENATE ENDDA BEGDA '000' INTO KEY2.
  CONCATENATE KEY1 KEY2 INTO FINAL_KEY SEPARATED BY BLANK1.
ENDIF.

MOVE FINAL_KEY TO TX-KEY.
* macro from table TRMAC: reads the TX cluster of PCL1 into PTEXT
RP-IMP-C1-TX.

LOOP AT PTEXT.
  WRITE: / PTEXT.
ENDLOOP.
Hope it helps.
Regards
Hiren K.Chitalia

Similar Messages

  • SQ02 Data retrieval by program

    Hello,
    I would like to create an ABAP query (SQ01) using an infoset (SQ02) with the options
    'Data retrieval by program' and 'Integrated program'.
    I wrote and tested this program:
    REPORT  ZONR_REPORT_MBEW.
    data:  mbew_tab like mbew OCCURS 0 WITH HEADER LINE.
    * <Query_head>
    select * from mbew into table mbew_tab
        where bwkey = 'MANT' and bklas = '3000' and matnr = '000000000001027712'.
        call function  'MBEW_EXTEND'
          exporting
            XVPER = 'X'
          tables
            mbew_tab = mbew_tab.
    loop at mbew_tab.
    * <Query_body>
      endloop.
    Could you help me fill the 'Data structure' area of the option
    'Data retrieval by program'?
    Via the menu 'Goto' -> 'Code' -> 'Data reading program'
    I found this program template:
    REPORT  RSAQDVP_TEMPLATE .
    *   declarations
    *   (insert your declarations in this section)
    data:
      MBEW                           type MBEW                          ,
      it_data type standard table of MBEW                          .
    field-symbols: <struc> type MBEW                          .
    *   selection screen statements
    *   (define your selection-screen here)
    * !! the following comment MUST NOT BE CHANGED !!
    *<QUERY_HEAD>
    *   read data into IT_DATA
    *  (select your data here into internal table IT_DATA)
    *   output of the data
    *   (this section can be left unchanged)
    loop at it_data assigning <struc>.
      move-corresponding <struc> to MBEW                          .
    * !! the following comment MUST NOT BE CHANGED !!
    *<QUERY_BODY>
    endloop.
    Could you help me customize the program ZONR_REPORT_MBEW according
    to the template? I'm a newbie in ABAP.
    Thanks for your help.
    Marco

    The "Data retrieval program" would be coded like any normal report except for adding a few comment tags that will be used by SQ01 query generator as placeholders to insert its own code into your code to generate the query program.
    In general this is how a data retrieval program is coded. SQ01 when it generates the query will insert its own code in place of *<Query_body> and *<Query_head> comment tags
    REPORT ztest_sq01_driver_program.
    TABLES: <name of dictionary structure of your infoset>.
    * DATA declarations
    START-OF-SELECTION.
    *<Query_head>
    * <Fetch your data here and store it in an internal table gt_report> that has all the fields that you need in your query
    END-OF-SELECTION.
      LOOP AT gt_report INTO gs_report.
        MOVE-CORRESPONDING gs_report TO <name of dictionary structure of your infoset>.
    *<Query_body>
      ENDLOOP.
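    For Marco's concrete case, here is a minimal sketch of how the MBEW selection from ZONR_REPORT_MBEW could be fitted into that template. It is only an illustration: the selection-screen parameters p_bwkey, p_bklas and p_matnr are assumptions, MBEW is taken to be the dictionary structure assigned to the infoset, and the MBEW_EXTEND call simply mirrors the one in the original report.
    REPORT zonr_report_mbew.
    * declarations (the work area MBEW doubles as the structure the query generator expects)
    DATA: mbew    TYPE mbew,
          it_data TYPE STANDARD TABLE OF mbew.
    FIELD-SYMBOLS: <struc> TYPE mbew.
    * selection screen statements
    PARAMETERS: p_bwkey TYPE bwkey,
                p_bklas TYPE bklas,
                p_matnr TYPE matnr.
    * !! the following comment MUST NOT BE CHANGED !!
    *<QUERY_HEAD>
    * read data into IT_DATA
    SELECT * FROM mbew INTO TABLE it_data
      WHERE bwkey = p_bwkey
        AND bklas = p_bklas
        AND matnr = p_matnr.
    * enrich the rows as in the original report
    CALL FUNCTION 'MBEW_EXTEND'
      EXPORTING
        xvper    = 'X'
      TABLES
        mbew_tab = it_data.
    * output of the data (this section can be left unchanged)
    LOOP AT it_data ASSIGNING <struc>.
      MOVE-CORRESPONDING <struc> TO mbew.
    * !! the following comment MUST NOT BE CHANGED !!
    *<QUERY_BODY>
    ENDLOOP.
    The *<QUERY_HEAD> and *<QUERY_BODY> comments must stay exactly as they are; SQ01 replaces them with generated code, and the 'Data structure' field of the infoset would then simply name the dictionary structure used here (MBEW).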

  • BPC 10 - EPM data retrieval very slow!

    Hi BPCers,
    We are using an Excel EPM Input Schedules as a Resource Management tool - using VBA to provide the functionality we need.
    Performance is generally good, but quickly deteriorates when handling larger data sets - even 500-600 rows of transactional data is enough to slow data retrieval from our BPC cube into EPM to an unusable speed. In relative terms this is pretty small, so there should be some option for optimisation.
    Does anybody have any experience with this? All suggestions welcome. We are operating on EPM Service Pack 7 Patch 1, but I'm not sure that EPM is necessarily the problem here.
    Thanks,
    Tom

    Thanks Gersh,
    Had a look through fiddler and have identified the job that is causing the delay - some rooting around in the ABAP debugger produced the answer as to why adding more data slows processing speed so dramatically.
    When we take data from the back end, we select a couple of parameters which limit the range of data that we are pulling through - a certain set of people, and a certain range of days. Once this is pulled through, allocations are made to any combination of person and day within this range, which generates an extra two properties - a project ID and a work status.
    This makes 4 properties, and when BPC pulls data it attempts to find every combination of every one of the properties that exists within this range - so the more allocations are made, the more this slows down, as it dramatically increases the number of combinations.
    The result is that BPC runs through a couple of hundred thousand generated tables, most of which are nonsense.
    Not sure what to do from here. This is how BPC reads data so approaching a fix could be difficult.
    Tom

  • SQ02 - Create InfoSet using 'Data retrieval by program'

    Using SQ02, I would like to use the data returned by an ABAP program as the data source for an InfoSet.  When attempting to create an InfoSet, I noticed the 'Data retrieval by program' option.  I would like to discover how to use this option and how to pass parameters to the ABAP program I'd like to use.

    The "Data retrieval program" would be coded like any normal report except for adding a few comment tags that will be used by SQ01 query generator as placeholders to insert its own code into your code to generate the query program.
    In general this is how a data retrieval program is coded. SQ01 when it generates the query will insert its own code in place of *<Query_body> and *<Query_head> comment tags
    REPORT ztest_sq01_driver_program.
    TABLES: <name of dictionary structure of your infoset>.
    * DATA declarations
    START-OF-SELECTION.
    *<Query_head>
    * <Fetch your data here and store it in an internal table gt_report> that has all the fields that you need in your query
    END-OF-SELECTION.
      LOOP AT gt_report INTO gs_report.
        MOVE-CORRESPONDING gs_report TO <name of dictionary structure of your infoset>.
    *<Query_body>
      ENDLOOP.

  • Query Error Information: Result set is too large; data retrieval ......

    Hi Experts,
    I have a problem with my query. When I'm executing my report and drilling in my navigation panel, instead of a table with values the message "Result set is too large; data retrieval restricted by configuration" appears. I already applied Note 1127156 - "Safety belt: Result set is too large". I imported Support Package 13 for SAP NetWeaver 7.0 BI Java (BIIBC13_0.SCA / BIBASES13_0.SCA / BIWEBAPP13_0.SCA) and executed the program SAP_RSADMIN_MAINTAIN (in transaction SE38) with the object and the value the note specifies, but the problem still appears.
    What could I be missing? How can I fix this issue?
    Thank you very much for helping me out. (Any help would be rewarded.)
    David Corté

    You may ask your Basis team to increase the ESM buffer (rsdb/esm/buffersize_kb). Did you check the system's memory?
    Did you try to check the error dump using ST22 - Runtime error analysis?
    Edited by: ashok saha on Feb 27, 2008 10:27 PM

  • WAD : Result set is too large; data retrieval restricted by configuration

    Hi All,
    When trying to execute the web template with fewer restrictions, we are getting the error below:
    Result set is too large; data retrieval restricted by configuration
    Result set too large (758992 cells); data retrieval restricted by configuration (maximum = 500000 cells)
    But when we increase the number of restrictions it produces output. For example, if we give fiscal period, company code and brand we are able to get output, but if we give fiscal period alone it throws the above error.
    Note : We are in SP18.
    Do we need to change some setting in the configuration? If yes, where do we need to change it, or what else do we need to do to remove this error?
    Regards
    Karthik

    Hi Karthik,
    the standard setting for web templates is to display a maximum of 50,000 cells. The less you restrict your query, the more data will be displayed in the report. If you want to display more than 50,000 cells, the template will not be executed correctly.
    In general it is advisable to restrict the query as much as possible. The more data you display, the worse your performance will be. If you have to display more data and you execute the query from Query Designer, or if you use the standard template, you can individually set the maximum amount of cells. This is described over [here|Re: Bex Web 7.0 cells overflow].
    However I do not know if (and how) you can set the maximum amount of cells differently as a default setting for your template. This should be possible somehow, I think; if you find a solution for this, please let us know.
    Brgds,
    Marcel

  • Result set is too large; data retrieval restricted by configuration

    Hi,
    While executing a query for a given period, the message 'Result set is too large; data retrieval restricted by configuration' is displayed. I searched SDN and referred to the following link:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d047e1a1-ad5d-2c10-5cb1-f4ff99fc63c4&overridelayout=true
    Steps followed:
    1) Transaction Code SE38
    2) In the program field, entered the report name SAP_RSADMIN_MAINTAIN and Executed.
    3) For OBJECT, entered the parameter BICS_DA_RESULT_SET_LIMIT_MAX
    4) For VALUE, entered the value for the size of the result set, and then executed the program:
    After the said steps, the below message is displayed:
    OLD SETTING:
    OBJECT =                                VALUE =
    UPDATE failed because there is no record
    OBJECT = BICS_DA_RESULT_SET_LIMIT_MAX
    Similar message is displayed for Object: BICS_DA_RESULT_SET_LIMIT_DEF.
    Please let me know as to how to proceed on this.
    Thanks in advance.

    Thanks for the reply!
    The objects are not available in the RSADMIN table.

  • Real tough data retrieval - assistance needed

    Late 2011 Macbook Pro with 500GB hard drive
    Lion 10.7
    One morning out of absolutely nowhere I get this grey screen with a flashing question mark folder. I take it to the geniuses at the Apple store and they tell me my hard drive has failed (no explanation). My mac is under warranty so they gave me a new hard drive for free, bagged my old hard drive and told me "good luck" retrieving the data.
    I'm on a mission to retrieve the data without paying for services. I have never retrieved data before but I've been doing a lot of forum reading and I have been getting protips from an IT friend who has saved my PC data before.
    As of now, I have been unable to even access the hard drive and so I am reaching out to the community to help me conquer this project.
    the problem is not the OS (according to Apple store)
    when hooking up with the dongle, neither Finder nor Disk Utility detect the bad drive
    the drive will spin when forced by the dongle (so I've ruled out the freezer method)
    I was advised (by friends and forums alike) to download some powerful data retrieval software:
    Data Rescue [did not detect bad external drive]
    Disk Drill [did not detect bad external drive]
    TestDisk (http://www.cgsecurity.org/wiki/TestDisk) [I can get it up and running but I have no idea how to use this software]
    So that seems to be the big problem: when I hook up my failed drive as an external hard drive, there is no acknowledgement from my MBP that it is connected, and as such data recovery programs cannot access it. When I still had it installed in my computer, there was no clicking sound, and it doesn't seem like any of the components are jammed up as it still spins.
    Where do I go from here? Please keep in mind that I am new to resolving my own technical problems but I'm willing to learn. Tired of being one of those people who look at computer parts and get anxious.
    NOTE: I haven't tried Target Mode as I do not have a firewire cable or access to another mac (yet). If you think I should try this, please let me know.

    A hard drive that will not divulge BOTH its make & model and a reasonable size/capacity to the likes of Disk Utility and data rescue programs has died, and cannot be repaired with any software.
    Target Disk mode will not improve anything. The drive is read as a Mac Volume by software. If it won't mount under Mac OS X, it won't mount under Target Disk Mode.

  • Should I use a data retrieval service or software to recover data ?

    Please pardon the length of this post.
    Here's the setup for our office:
    Computer 1:
    10.4.8 OS X
    1 GHZ PowerPC G4: silver grey tower, grey apple
    1MB L3 Cache
    256 MB DDR SDRAM
    Computer 2:
    10.4.8 OS X
    Dual 450 MHZ PowerPC G4: blue grey tower, blue apple
    256 MB SDRAM
    Computer 3:
    10.4.8 OS X
    1 GHZ PowerPC G4 IMac Flat Screen:
    256 MB DDR SDRAM
    I have 2 LaCie Big Disk d2 Extremes daisy chained and connected to the IMac. We use the first to store all of our data to keep our local disks free. The second d2 is the backup to the first. The other 2 computers connect to the d2's via an ethernet hub. The d2's are each partitioned into 4 compartments.
    A couple of days ago I started the system up when I got in in the morning, and the main d2 would not open. I ran Disk Utility, but it said that the drive was damaged beyond its ability to repair. I ran DiskWarrior, and it gave me this message:
    "The directory of disk 'G4' cannot be rebuilt. The disk was not modified. The original directory is too severely damaged. It appears another disk utility has erased critical directory information. (3004, 2176)."
    I contacted DiskWarrior tech support, and after a series of exchanges that had me send them data extracted from the Terminal, the technician said this:
    "It appears that the concatenated RAID inside your LaCie
    drive has failed (your 500GB drive is actually 2 250GB
    hard drives). That is why we only saw 2 partitions
    on "disk3". A possible cause could be a failed bridge
    in the case.
    You may be looking at sending this drive to a data recovery service.
    However, it is possible that we may be able to recover data
    from the partitions that we CAN see. What we would be doing would cause no damage to your data
    unless the hard drives were having a mechanical failure (ie, the
    head crashed and was contacting the platters, similar to scratching
    a record). But from what I've seen, I don't feel that is the case.
    I believe the piece of hardware that 'bridges' the two drives to
    make them act as one has failed. that's why we can only see data
    about 1 of the 2 drives in the case.
    We would only be attempting to gather data off the drive. Since
    data recovery services sometimes charge for amount of data retrieved,
    it's up to you how you want to proceed."
    Most of the data from the past 5 years for our business stands to be lost. Only some of it had been properly backed up on the second drive, due to some backup software issues. I want to do whatever I can to retrieve all, or at least some, of the data. From what the Alsoft technician said, do you think the data recovery software available to the consumer is robust enough to retrieve at least the data from the one disk in the drive that is recognizable (there are 2 250-gig disks in the d2X; only one is responding at all)? If so, do these programs further damage the disks? Or should I just send the drive to a data recovery service?
    I'd like to try to extract some of it myself via over-the-counter retrieval software, but I don't know whether to trust these programs.
    Any advice would be greatly appreciated.
    Thanks in advance.
    Peter McConnell
    1 GHZ PowerPC G4 IMac Flat Screen   Mac OS X (10.4.8)   Posted

    Peter
    My 2 cents:
    I have used FileSalvage
    http://www.subrosasoft.com/OSXSoftware/index.php?mainpage=product_info&productsid=1
    to recover files from damaged disks. It works as advertised, within limits. Some files may be too damaged to recover. More importantly, you get to scan the disk before actually recovering, and it will give you a list of what it thinks it can recover.
    My experience was that it recovered approx 85% of the data.
    YMMV but they do have a trial.
    Regards
    TD

  • Report Developed in Webi Rich Client Consuming more time in Data Retrieval

    Dear All,
    I am a BO consultant. Recently in my project I developed a report in Webi Rich Client. At the time of development and on subsequent days the report was working fine (Data Retrieval time less than 1 minute), but after some days it started taking much longer (increasing day by day; now it takes more than 11 minutes).
    Can anybody point out what could be the reason?
    We are using,
    1. SAP BI 7.0
    2. SAP BO XI 3.1 Edge
    3. Webi Rich Client Version: 12.3.0, Build 601
    This report is made on a Multiprovider (Sales).
    What are the important points that should be considered so that we can improve the performance of Webi reports?
    Waiting for a suitable solution.
    Regards,
    Arun Krishnan.G
    SAP BO Consultant
    Edited by: ArunKG on Oct 11, 2011 3:50 PM

    Hi,
    Please come back here with a copy/paste of the 2 MDX statements from the MDA.log to compare the good/bad runtimes,
    and the 2 equivalent DPCOMMANDS clauses (good and bad) from the WebI trace logs.
    Can you explain what you really mean in the bold text above? Actually I didn't get you.
    Pardon, I have only 3 months' experience in BO.
    Regards,
    Arun
    Edited by: ArunKG on Oct 11, 2011 4:28 PM

  • How we can see the abap memory data

    How can we see the ABAP memory data?
    Find the code below:
    import lsind
             report_title
             table_name
             report_field
             change_display
             show_hide
             conversion_exits
             table_description
             form_program
             select_form
             update_form
             line_size
             line_count
             records[]
             fields[]
             header_fields[]
             select_fields[]
             xrep[]
             from memory id 'LZUT5U11'.
    Regards
    santhosh
    mail-id : [email protected]

    Dear Santosh,
    ABAP MEMORY:
    A logical memory model illustrates how the main memory is distributed from the view of executable programs. A distinction is made here between external sessions and internal sessions.
    An external session is usually linked to an R/3 window. You can create an external session by choosing System/Create session, or by entering /o in the command field. An external session is broken down further into internal sessions. Program data is only visible within an internal session. Each external session can include up to 20 internal sessions (stacks).
    Every program you start runs in an internal session.
    All "squares" with rounded "corners" displayed in the status diagram represent a set of data objects in the main memory.
    The data in the main memory is only visible to the program concerned.
    CALL TRANSACTION and SUBMIT AND RETURN open a new internal session that forms a new program context. The internal sessions in an external session form a memory stack. The new session is added to the top of the stack.
    When a program has finished running, the top internal session in the stack is removed, and the calling program resumes processing.
    The same occurs when the system processes a LEAVE PROGRAM statement.
    LEAVE TO TRANSACTION removes all internal sessions from the stack and opens a new one containing the program context of the calling program.
    The ABAP memory is initialized after the program is called. In other words, you cannot transfer any data to a program called with LEAVE TO TRANSACTION via the ABAP memory.
    SUBMIT replaces the internal session of the program performing the call with the internal session of the program that has been called. The new internal session contains the program context of the called program with which it is performed.
    When a function module is called, the following steps are executed:
    A check is made to establish whether your program has called a function module of the same function group previously.
    If this is not the case, the system loads the associated function group to the internal session of the calling program as an additional program group. This initializes its global data.
    If your program used a function module of the same function group before the current call, the function module that you have called up at present can access the global data of the function group. The function group is not reloaded.
    Within the internal session, all of the function modules that you call from the same group access the global data of that group.
    If, in a new internal session, you call a function module from the same function group as in internal session 1, a new set of global data is initialized for the second internal session. This means that the data accessed by function modules called in session 2 may be different from that accessed by the function modules in session 1.
    You can call function modules asynchronously as well as synchronously. To do so, you must extend the function module call using the addition STARTING NEW TASK '<task name>'. Here, <task name> is a symbolic name in the calling program that identifies the external session in which the called program is executed.
    Function modules that you call using the addition STARTING NEW TASK '<task name>' are executed independently of the calling program. The calling program is not interrupted.
    To make function modules available for local asynchronous calls, you must identify them as executable remotely (processing type: Remote-enabled module).
    There are various ways of transferring data between programs that are running in different program contexts (internal sessions). You can use:
    (1) The interface of the called program (standard selection screen, or the interface of a subroutine, function module, or dialog module)
    (2) ABAP memory
    (3) SAP memory
    (4) Database tables
    (5) Local files on your presentation server.
    For further information about transferring data between an ABAP program and your presentation server, refer to the documentation for the function modules WS_UPLOAD and WS_DOWNLOAD.
    Function modules have an interface, which you can use to pass data between the calling program and the function module itself (there is also a comparable mechanism for ABAP subroutines). If a function module supports RFC, certain restrictions apply to its interface.
    If you are calling an ABAP program that has a standard selection screen, you can pass values to the input fields. There are two options here:
    By using a variant of the standard selection screen in the program call
    By passing actual values for the input fields in the program call
    If you want to call a report program without displaying its selection screen (default setting), but still want to pass values to its input fields, there is a variety of techniques that you can use.
    The WITH addition allows you to assign values to the parameters and select-options fields on the standard selection screen.
    If the selection screen is to be displayed when the program is called, use the addition: VIA SELECTION-SCREEN.
    Use the pattern button in the ABAP Editor to insert a program call via SUBMIT. The structure shows you the names of data objects that you can complete with the standard selection screen.
    For further information on working with variants and further syntax variants for the WITH addition, see the key word documentation in the ABAP Editor for SUBMIT.
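    As an illustration only (ZREPORT_TARGET and its selection fields p_pernr and s_date are hypothetical names), a call with the WITH addition might look like this:
    SUBMIT zreport_target
      WITH p_pernr = '00001234'
      WITH s_date BETWEEN '20240101' AND '20241231'
      AND RETURN.
    * add VIA SELECTION-SCREEN before AND RETURN if the selection screen
    * should be displayed with these values filled in as defaults
    AND RETURN brings control back to the calling program once the submitted report has finished.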
    You can use SAP memory and ABAP memory to pass data between different programs.
    The SAP memory is a user-specific memory area for storing field values. It is available in all of the open sessions in a user's terminal session, and is reset when the terminal session ends. You can use its contents as default values for screen fields. All external sessions can access SAP memory. This means that it is only of limited use for passing data between internal sessions.
    The ABAP memory is also user-specific, and is local to each external session. You can use it to pass any ABAP variables (fields, structures, internal tables, complex objects) between the internal sessions of a single external session.
    Each external session has its own ABAP memory. When you end an external session (/i in the command field), the corresponding ABAP memory is released automatically.
    To copy a set of ABAP variables and their current values (a data cluster) to the ABAP memory, use the EXPORT ... TO MEMORY ID <id> statement. The <id> (up to 32 characters) is used to identify the different data clusters.
    If you repeat an EXPORT TO MEMORY ID statement for an existing data cluster, the new data overwrites the old.
    To copy data from ABAP memory into the corresponding fields of an ABAP program, use the IMPORT ... FROM MEMORY ID <id> statement.
    The fields, structures, internal tables, and complex objects in a data cluster in ABAP memory must be declared identically in both the program from which you exported the data and the program into which you import it.
    To release a data cluster, use the FREE MEMORY ID statement.
    You can import just parts of a data cluster with IMPORT, since the objects are named in the cluster.
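    As a minimal sketch of this EXPORT/IMPORT handoff between two internal sessions (the report names ZCALLING_REPORT / ZCALLED_REPORT, the memory ID and the variables are made up for illustration):
    REPORT zcalling_report.
    DATA: gt_lines TYPE STANDARD TABLE OF string,
          gv_title TYPE string.
    APPEND 'first line' TO gt_lines.
    gv_title = 'Example cluster'.
    * put the data cluster into ABAP memory under a freely chosen ID ...
    EXPORT gt_lines gv_title TO MEMORY ID 'ZDEMO_CLUSTER'.
    * ... and call the second program in the same external session
    SUBMIT zcalled_report AND RETURN.
    * In ZCALLED_REPORT the objects must be declared identically, then:
    *   IMPORT gt_lines gv_title FROM MEMORY ID 'ZDEMO_CLUSTER'.
    *   IF sy-subrc = 0.
    *     FREE MEMORY ID 'ZDEMO_CLUSTER'.  "release the cluster when done
    *   ENDIF.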
    In the SAP memory, you can define memory areas (SET/GET parameters, or parameter IDs), which you can then address by a name of up to 20 characters.
    You can fill these memory areas either using the contents of input/output fields on screens, or using the ABAP statement:
    SET PARAMETER ID '<pid>' FIELD <field>.
    The memory area with the name <pid> now has the value of <field>.
    You can use the contents of a memory area to display a default value in an input field on a screen.
    You can also read the memory areas from the SAP memory using the ABAP statement GET PARAMETER ID '<pid>' FIELD <field>. The field <field> then contains the value from parameter <pid>.
    The link between an input/output field and a memory area in SAP memory is inherited from the data element on which the field is based. You can enable the set parameter or get parameter attributes in the input/output field attributes.
    Once you have set the Set parameter attribute for an input/output field, you can fill it with default values from SAP memory. This is particularly useful for transactions that you call from another program without displaying the initial screen. For this purpose, you must activate the Set parameter functionality for the input fields of the first screen of the transaction.
    You can:
    (1) Copy the data that is to be used for the first screen of the transaction to be called to the parameter ID in the SAP memory. To do so, use the statement SET PARAMETER immediately before calling the transaction.
    (2) Start the transaction using CALL TRANSACTION <tcode> or LEAVE TO TRANSACTION <tcode>. If you do not want to display the initial screen, use the AND SKIP FIRST SCREEN addition.
    (3) The system program that starts the transaction fills the input fields that do not already have default values and for which the Get parameter attribute has been set with values from SAP memory.
    The Technical information for the input fields in the transaction you want to call contains the names of the parameter IDs that you need to use.
    Parameter IDs should be entered in table TPARA. This happens automatically if you create them via the Object navigator.
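    For example (a minimal sketch: the material number is made up, but parameter ID 'MAT' and transaction MM03 are standard objects, and the material field on the first screen carries the GET parameter attribute):
    DATA gv_matnr TYPE matnr VALUE '000000000000000023'.
    * fill the SAP memory area behind parameter ID MAT ...
    SET PARAMETER ID 'MAT' FIELD gv_matnr.
    * ... then call the display transaction without its initial screen
    CALL TRANSACTION 'MM03' AND SKIP FIRST SCREEN.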
    Programs that you call using the statements SUBMIT <prog>, LEAVE TO TRANSACTION <tcode>, SUBMIT <prog> AND RETURN, or CALL TRANSACTION <tcode> run in their own SAP LUW, and update requests receive their own update key.
    When you use SUBMIT <prog> and LEAVE TO TRANSACTION <tcode>, the SAP LUW of the calling program ends. If no COMMIT WORK statement occurred before the program call, the update requests in the log table remain incomplete and cannot be processed. They can no longer be executed. The same applies to inline changes that you make using PERFORM ... ON COMMIT.
    Data that you have written to the database using inline changes is committed the next time a new screen is displayed.
    If you use SUBMIT AND RETURN or CALL TRANSACTION to insert a program and then return to the calling program, the SAP LUW of the calling program is resumed when the called program ends. The LUW processing of calling and called programs is independent.
    In other words, inline changes are committed the next time a new screen is displayed. Update requests and calls using PERFORM ... ON COMMIT require an independent COMMIT WORK statement in the SAP LUW in which they are running.
    Function modules run in the same SAP LUW as the program that calls them.
    If you call transactions with nested calls, each transaction needs its own COMMIT WORK, since each transaction maps its own SAP LUW.
    The same applies to calling executable programs, which are called using SUBMIT AND RETURN.
    The statement CALL TRANSACTION allows you to
    Shorten the user dialog when calling using CALL TRANSACTION <tcode> USING <bdc_tab>.
    Determine the type of update (asynchronous, local, or synchronous) for the transaction called. For this purpose, use the addition CALL TRANSACTION <tcode> USING <bdc_tab> UPDATE 'update_mode', where update_mode can have the values A (asynchronous), L (local), or S (synchronous).
    Combining the two options enables you to call several transactions in sequence (logical chain), to reduce their screen sequence, and to postpone processing of the SAP LUW 2 until processing of the SAP LUW 1 has been completed.
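    A minimal sketch of such a call (transaction ZDEMO, its module pool SAPMZDEMO, screen 0100 and the screen field are hypothetical; BDCDATA is the standard batch-input structure):
    DATA: lt_bdcdata TYPE STANDARD TABLE OF bdcdata,
          ls_bdcdata TYPE bdcdata.
    * first entry names the screen to be processed
    ls_bdcdata-program  = 'SAPMZDEMO'.
    ls_bdcdata-dynpro   = '0100'.
    ls_bdcdata-dynbegin = 'X'.
    APPEND ls_bdcdata TO lt_bdcdata.
    * subsequent entries fill individual screen fields
    CLEAR ls_bdcdata.
    ls_bdcdata-fnam = 'ZDEMO-FIELD'.
    ls_bdcdata-fval = 'VALUE'.
    APPEND ls_bdcdata TO lt_bdcdata.
    * MODE 'N' suppresses the screens, UPDATE 'S' forces synchronous update
    CALL TRANSACTION 'ZDEMO' USING lt_bdcdata
                             MODE 'N'
                             UPDATE 'S'.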
    When you call a function module asynchronously using the CALL FUNCTION ... STARTING NEW TASK '<task name>' statement, it runs in its own SAP LUW.
    Programs that are executed with a SUBMIT ... AND RETURN or CALL TRANSACTION statement start their own LUW processing. You can use these to perform nested (complex) LUW processing.
    You can use function modules as modularization units within an SAP LUW.
    Function modules that are called asynchronously are suitable for programs that allow parallel processing of some of their components.
    All techniques are suitable for including programs with purely display functions.
    Note that a function module called with CALL FUNCTION STARTING NEW TASK is executed as a new logon. It, therefore, sees a separate SAP memory area. You can use the interface of the function module for data transfers.
    Example: In your program, you want to call a display transaction that is shown in a separate (amodal) window. To do so, you encapsulate the transaction call in a function module, which you flag as a remote-enabled module. You use the function module interface to accept values that you write to the SAP memory. You then call the transaction in the function module using CALL TRANSACTION ... AND SKIP FIRST SCREEN. Finally, you call the function module itself asynchronously.
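    A minimal sketch of such an asynchronous call (Z_SHOW_DOCUMENT is a hypothetical remote-enabled function module, its parameter iv_docnr is made up, and 'DISP1' is just a symbolic task name):
    DATA gv_docnr(10) TYPE c VALUE '0000004711'.
    * the call returns immediately; the function module runs in its own
    * session with its own SAP memory
    CALL FUNCTION 'Z_SHOW_DOCUMENT'
      STARTING NEW TASK 'DISP1'
      EXPORTING
        iv_docnr = gv_docnr.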
    Type 'E' locks for nested program calls may be requested more than once for the same object. This behavior can be described as follows:
    Lock entries from function modules called synchronously increment the cumulative counter and are therefore successful.
    Lock entries from programs called with CALL TRANSACTION or SUBMIT AND RETURN are refused. The object to be locked by the called program is displayed as already locked by another user.
    Programs that you call using SUBMIT or LEAVE TO TRANSACTION cannot come into conflict with lock entries from the calling program, since the old program ends when the call is made. When a program ends, the system deletes all of the lock entries that it had set.
    Lock requests belonging to the same user from different R/3 windows or logons are treated as lock requests from other users.
    Regards,
    Rajesh.
    Please reward points if found helpful.

  • Data retrieval for Sony handycam with 60G harddrive.

    Does Best Buy do data retrieval? I have a handycam with work video on it that accidentally got formatted. I know with a standard hard drive you can still retrieve the data if there has not been any recording since the formatting, but I am not sure about camera hard drives. If Best Buy does not do it, does anyone know where it can be done and the approximate cost? I live in the New Orleans area.
    I may try the same question in computers if nobody knows.  Thanks in advance for any guidance at all. 

    Best Buy sends out for data retrieval. You can have it sent out through geek squad to get the estimate of cost.
    Crystal
    Superuser
    Forum Guidelines | Terms & Conditions | Community Guidelines | What is a Superuser?
    *Remember to mark your questions solved and click the star to give kudos to show your thanks!*
    While I used to be a Best Buy Employee, I no longer have any affiliation with Best Buy.
    My opinions do not in any way shape or form represent Best Buy's Official decisions.

  • Data Retrieval Speed in Oracle Spatial vs. ESRI ArcSDE

    I would appreciate any opinions regarding data retrieval performance between Oracle Spatial and ESRI ArcSDE. Would an end-user (using ESRI software) experience significant differences in data retrieval speed depending on how the data were stored in Oracle (MDSYS.SDO_GEOMETRY versus ESRI binary/BLOB formats)?
    Knowing that the ESRI binary formats are tailored to their software front-end apps (ArcGIS, ArcMap, ArcCatalog, and ArcInfo), wouldn't this be a "non-issue" until the spatial dataset gets "large", and even then, wouldn't performance be (almost) equal if the spatial indexes were created properly?
    Thanks for your inputs,
    Bruce

    John,
    You can't do that type of query in SQL from SQL*Plus using SDEBINARY. However, you can perform spatial queries in ArcMap if you are using SDEBINARY. You can use the query builder to perform point-in-polygon type queries.
    Hope that helps.
    For my two cents, I think SDO_GEOMETRY gives you a more robust database to work with, because you have the added power of Oracle Spatial functions. If you are using SDEBINARY you are limited to only what you can do through ArcGIS.
    If you are concerned more about performance than accessibility, especially with a large number of users, then SDEBINARY might be the better choice.
    I love Oracle Spatial and am hoping that the performance issue will not be a serious one when we start putting ArcIMS-developed apps into production.
    Dave

  • Passing values to subreport in SSRS throwing an error - Data Retrieval failed for the report, please check the log for more details.

    Hi,
    I have a subreport called from the main report. The subreport is based on an MDX query against the SSAS cube. Some dimensions in the cube have the values 0 and 1.
    When I try to pass '0' to the subreport as the parameter value, it gives the error "Data Retrieval failed for the report, please check the log for more details".
    Actually I am using a table for storing these parameter values. In the main report I am calling this table (dataset) and passing these values to the subreport.
    So I have given [0],[1] and this works fine; when I give only either [0] or [1], it throws the error.
    Could you please advise on this.
    Appreciate all and any help.
    Thanks,
    Divya

    Hi Divya,
    Based on the current description, I understand that there is no issue if you pass two values from the main report to the subreport, while the issue occurs when passing one value to the subreport.
    To narrow down the issue, I want to confirm whether the subreport can run if there is only [0] or [1] in the subreport. If so, it indicates that the query statements in the subreport contain an error. If that is not the case, it shows the issue occurs while passing values from the main report to the subreport. For further analysis, please post the details of the query statements of the subreport to the forum.
    Regards,
    Heidi Duan
    TechNet Community Support

  • Slow data retrieval on 8i

    I am using an Oracle 8i database on Windows 2000 Server on a Compaq ProLiant 350 server machine. The problem is that sometimes connecting from a client is very slow, and after connecting, data retrieval is also slow. I am using TCP/IP and Net8. I came to know from the internet that there is a patch which is actually a workaround for this problem. Can anybody help me locate this patch?
    Thanks in advance.
    G. Rajan.

    You can probably prove that this is the issue by creating a retrieve using the Excel add-in or Smart View to replicate the form and see how long it takes to retrieve.
    You will also see in the Essbase app logs how long it is taking to perform the retrieve.
    Cheers
    John
    http://john-goodwin.blgspot.com/
