Regarding detail data coming in a grid in a PO Report

Hi
I have a requirement where the detail data needs to be printed in a grid. I have placed the grid on the jfmain page and all the subforms on the next page.
When I run the report for multiple purchase order numbers, the grid does not appear from the second purchase order onwards, although the data itself comes out correctly.
I am using ^eject for a page break after each PO's data.
Can anyone help me in this regard?
TIA
vijay
[email protected]

Hi Abilash,
As in the Dictionary, all quantity fields should have a reference to the unit field, which then defines the number of decimals.
The rest is done automatically as long as you do it the standard way: in the field catalog you have to set fcat-qfieldname to the name of the field that holds the unit.
If you do so, totals will also be grouped by unit (see the sketch below).
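A minimal sketch, assuming a REUSE_ALV-style field catalog (type pool SLIS) and an internal table with quantity field MENGE and unit field MEINS (adjust the names to your own structure):

TYPE-POOLS: slis.

DATA: lt_fcat TYPE slis_t_fieldcat_alv,
      ls_fcat TYPE slis_fieldcat_alv.

CLEAR ls_fcat.
ls_fcat-fieldname  = 'MENGE'.  " quantity column
ls_fcat-qfieldname = 'MEINS'.  " unit field: drives decimals and per-unit totals
APPEND ls_fcat TO lt_fcat.

CLEAR ls_fcat.
ls_fcat-fieldname  = 'MEINS'.  " the unit column itself
APPEND ls_fcat TO lt_fcat.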
SAP really did a great thing by inventing units for quantities (and currency keys for currency amounts); many developers just do not understand this and therefore refuse to use it.
Regards,
Clemens

Similar Messages

  • Details Data in SAP BI could not be updated completely: see log

Hi,
I need some help: while creating a campaign I am getting the error below.
Details Data in SAP BI could not be updated completely: see log
Diagnosis
A problem occurred as data was being saved in the BI system.
System Response
The data cannot be completely saved in the BI system.
Procedure
For more information about the reasons for this, see Messages.
I did the settings in SPRO to activate the BI update and multiple BW hierarchies:
SPRO -> CRM -> Marketing -> Marketing Planning and Campaign Management -> System Landscape -> "Activate BI Update & BI Multi Hier."
But in BI I am not getting multiple hierarchies; everything comes under "CRMMKTROOT" during extraction, not on save of the promotion.
I would appreciate it if you could help me with this issue.
    Regards,
    Raj Mittal


  • How to store the data coming from network analyser into a text or excel file

Hi everyone,
I'm using an Agilent 8719ET network analyser and wish to store the data coming from the network analyser in a text or Excel file.
Presently I'm able to get the data on a LabVIEW graph using GPIB. Can anyone suggest how to go ahead after the "collect data" subVI? How can the data be stored in a file apart from being shown on the graph?
Attached is the VI for your kind consideration.
Looking for help.
Regards
Rohit
    Attachments:
    Agilent 87XX Series Exceed Max Meas.vi ‏43 KB

First let me say that your code really looks pretty good. The data handling could be made more efficient by calculating the number of data points that will be in the completed dataset and preallocating the entire array -- but depending on your answers to my questions, the logic in the lower shift register may be going away, so we won't worry about that right now.
The thing I need to know before addressing the data storage question is: each time you call "Collect and Display Data.vi", how many elements are in the array? Are you reading single data points or a group of data? (BTW: if the answer to that question is obvious from the way the other VIs are set up, I don't have the drivers, so I can't tell what the setup values are.) Second, how fast does the loop iterate? Are we talking msec per loop? Seconds? Fortnights?
The issues here are two-fold: how much data, and how fast is it coming? The answers to these will tell you how to save the data.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • (ID 30101 Details: Data error (cyclic redundancy check) (0x80070017))

    Hi,
Can anyone please help with this? Our tape drive died and we got a new one from our maintenance company. Since then, large backups to tape have been failing: small jobs seem to go through OK, but when a job is greater than around 200GB it fails with the error below.
I tried updating the tape drive driver, but Windows said that the drive wasn't functioning properly any more.
The only difference between the old faulty drive and the new drive is the firmware: the old drive had A422, the new drive has D8D5. Can someone please help? It's driving me crazy.
    "Affected area:
    IBM TotalStorage 3573 Tape Library
    Occurred since: 03/03/2015 10:19:37
    Description: Library drive IBM ULTRIUM 5 HH 3580 TAPE DRIVE (1068006186) in IBM TotalStorage 3573 Tape Library is not functioning and library jobs may fail until the drive is repaired.
     The drive is not functioning for the following reason:
     (ID 3303)
    DPM encountered a critical error while performing an I/O operation on the tape Physical Servers-LT-MonthlyBackup-00000235 (000086L4) (Barcode - 000086L4) in Drive IBM ULTRIUM 5 HH 3580 TAPE DRIVE (1068006186).
    (ID 30101 Details: Data error (cyclic redundancy check) (0x80070017))
    More information
    Recommended action:
    Retry the operation. If the problem persists, contact your hardware vendor.
    No action required
    Resolution: To dismiss the alert, click below
    Inactivate"
    Regards
    Willie

    Hi,
It sounds like another faulty drive. You can try to reproduce the problem outside of DPM using some external utilities. If you get an error before the tape fills, you can use net helpmsg <errorcode> to see what the error was.
Download the DPMerasetape.zip file from the following link and extract it to the c:\temp folder:
https://onedrive.live.com/?cid=b03306b628ab886f&id=B03306B628AB886F%21524&sc=documents
The utilities are not that user-friendly, but here are the basics.
Always stop the DPMLA service prior to running MCT.EXE commands:
  NET STOP DPMLA
    C:\> mct-x64.exe -p
    Opening changer \\.\Changer0
         ********** Changer Parameters **********
             Number of Transport Elements : 1
             Number of Storage Elements : 50
             Number of Cleaner Slots : 0
             Number of of IE Elements : 0
             Number of NumberDataTransferElements : 6
             Number of Doors : 0
             First Slot Number : 0
             First Drive Number : 0
             First Transport Number : 0
             First IEPort number : 0
             First Cleaner Slot Address : 0
             Magazine Size : 0
             Drive Clean Timeout : 600
      Flags set for the changer :
             CHANGER_BAR_CODE_SCANNER_INSTALLED
             CHANGER_POSITION_TO_ELEMENT
             CHANGER_STORAGE_DRIVE
             CHANGER_STORAGE_SLOT
             CHANGER_DRIVE_CLEANING_REQUIRED
             CHANGER_VOLUME_IDENTIFICATION
             CHANGER_VOLUME_SEARCH
             CHANGER_SERIAL_NUMBER_VALID
     Changer can move from Slot to :
                     Slot
                     Drive
     Changer can move from Drive to :
                     Slot
                     Drive
     Changer is Capable of positioning transport to Slot.
     Changer is Capable of positioning transport to Drive.
    C:\> mct-x64.exe -d
    Opening changer \\.\Changer0
    Product Data for Medium Changer device :
      Vendor Id    : STK
      Product Id   : L180
      Revision     : 030
      SerialNumber : 3077520000
The MCT utility has the -m (move) command to move media around inside the library:
-m [ElemType-T] Transport# [ElemType-Source] S_lot#/D_rive# [ElemType-Destination] S_lot#/D_rive#
Get / view the command syntax for the -m (move) command for changer 0:
    C:\>mct-x64 0 -m
    Opening changer \\.\Changer0
    MoveMedium : mct -m t N s\d N s\d N   [Where s/d means Slot or Drive and N is ZERO based].
    Some Examples:
    mct-x64 -m t 0 s 0 d 0    (Using transport-0, move media from slot-0  to drive-0)
    mct-x64 -m t 0 d 0 s 0    (Using transport-0, move media from drive-0 to slot-0)
    mct-x64 -m t 0 s 0 s 100  (Using transport-0, move media from slot-0  to slot-100)
    mct-x64 -m t 0 d 0 d 1    (Using transport-0, move media from drive-0 to drive-1)
    mct-x64 -m t 0 s 0 ie 0   (Using transport-0, move media from slot-0  to IEPort 0)
    Once you move a tape into a drive, use mytape commands Loadtape, taperewind, locktape, Disable hardware compression, Set block size to 65536 (64K), writeforspanning.
You need the symbolic name for the tape drive you loaded media into: look in the DPM console, click the tape drive, and check the details for \\.\tape########. Use that in the following command.
    Mytape.exe \\.\Tape2147483638
    Status: Getting the handle for \\.\Tape2147483638...Success
\\.\Tape2147483638>TapeConsole_1.0>taperewind
    Status: Rewinding Tape ...Success
\\.\Tape2147483638>TapeConsole_1.0>setdriveinfo
    Hardware error correction  [y]-Enable / [n] Disable : y
    Hardware data compression  [y]-Enable / [n] Disable : N   (BE SURE TO DISABLE)
    Data padding  [y]-Enable / [n] Disable : n
    Setmark reporting   [y]-Enable / [n] Disable : n
    Number of bytes between the end-of-tape warning and the physical end of the tape: 0
    Status: Setting Drive Information...Success
\\.\Tape2147483638>TapeConsole_1.0>writeforspanning
    Status: Writing onto tape...Failed !!!
Error_ID reported: 1100    (net helpmsg 1100 = The physical end of the tape has been reached.)
    Number of bytes written: 983040     (Ignore bytes written, we'll get physical tape position later)
    Giving up
    Time taken: 15788ms
\\.\Tape2147483638>TapeConsole_1.0>taperewind
    Status: Rewinding Tape ...Success
    REPEAT
\\.\Tape2147483638>TapeConsole_1.0>erasetape s
    Short erase / Long Erase [s/l]:Status: Erasing the tape...Success
    \\.\Tape2147483646...Success
    c:\>mct-x64.exe -m t 0 d 0 s 0
    Opening changer \\.\Changer0
    Source is a Drive
    Destination is a Slot
    Move : Transport - 0, Src - 0, Dest - 0
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread. Regards, Mike J. [MSFT]
    This posting is provided "AS IS" with no warranties, and confers no rights.

  • Regarding downloading data into excel from alv output

Hi experts,
I have developed a customized ALV report in which the employee number and all other details appear. But when I press the "download to Excel" button, the downloaded Excel sheet contains all the details except the employee number. Please help me sort out this problem.
I also found that when I press the print-preview button, the employee number is likewise missing while the rest of the details appear.

Hi Ravi,
To download the data from ALV to Excel, use the following function modules:
SAP_CONVERT_TO_XLS_FORMAT
GUI_DOWNLOAD
In GUI_DOWNLOAD, pass FILETYPE = 'DBF', which makes all the data download correctly.
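A minimal call of the first function module might look like this (a sketch from memory; please verify the exact parameter names in SE37 before using it):

CALL FUNCTION 'SAP_CONVERT_TO_XLS_FORMAT'
  EXPORTING
    i_filename        = fname   " target .xls file
  TABLES
    i_tab_sap_data    = itab    " internal table to convert
  EXCEPTIONS
    conversion_failed = 1
    OTHERS            = 2.
IF sy-subrc <> 0.
  " handle the conversion error here
ENDIF.

And then GUI_DOWNLOAD with FILETYPE = 'DBF':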
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
    filename                = fname
    filetype                = 'DBF'
  TABLES
    data_tab                = itab
  EXCEPTIONS
    file_write_error        = 1
    no_batch                = 2
    gui_refuse_filetransfer = 3
    invalid_type            = 4
    no_authority            = 5
    unknown_error           = 6
    header_not_allowed      = 7
    separator_not_allowed   = 8
    filesize_not_allowed    = 9
    header_too_long         = 10
    dp_error_create         = 11
    dp_error_send           = 12
    dp_error_write          = 13
    unknown_dp_error        = 14
    access_denied           = 15
    dp_out_of_memory        = 16
    disk_full               = 17
    dp_timeout              = 18
    file_not_found          = 19
    dataprovider_exception  = 20
    control_flush_error     = 21
    OTHERS                  = 22.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
    Pls reward if useful.
    Thanks,
    Sirisha.

  • GRC 10 - Legacy connector as user detail data source

Hello,
I'm trying to use a legacy connector (with a text file as input) as a user data source.
The repository user sync for this legacy connector works: I checked the GRACUSER table, and it is populated with all the user details from the input file (ID, first name, last name, mail, department, phone).
I also got it working as the user search data source: when creating an access request for an "other" user, searching by user ID/name works and the data is displayed in the search results. However, when I select the user from the search results, the user details are not populated in the access-request form.
Any clue about this? Has anyone already got this working?
GRC 10.0 SP13.
I checked the SP14 and SP15 release notes and found no relevant notes yet.
Repository-related notes applied:
- 1864423
- 1950231
Regards,
Emmanuel.

Hi Pedro,
You have only confirmed that the two accounts are maintained in HCM and in SU01 as well, so you would be able to see these accounts' details both ways.
Yes, you are right about maintaining the user account in HCM first at the time of a new hire; after that you can manually raise the access request to grant access to the various SAP systems. Or, to automate this process as Prashant suggested, you can use HR triggers.
You can refer to: GRC 10.0 - HR Trigger configuration - Governance, Risk and Compliance - SCN Wiki
But responding to your original discussion: whichever user accounts are maintained in HCM, you would see those details provided you define HR as the "user search data source" and SU01 as the "user detail data source".
In your case the two accounts have been maintained in HCM as well as SU01, and that is what is creating the confusion.
Let us know if you need any more clarification.
Regards,
Ameet

  • How to insert data coming from 2 different file adapters into one DB adapter

Hi
I want to insert data into a database containing two different tables, so I imported the tables into a DB adapter and created the relationships. The data for the two tables is in XML format and sits in two different locations, so I used two file adapters to read it, together with a BPEL (Define Service Later) component. In BPEL I used a receive activity to receive the first file adapter's data (with "create instance" checked), then a transform activity, and finally an invoke activity to call the DB adapter. I repeated the same sequence for the second file adapter, placing the second receive (no "create instance" needed) after the first invoke. The problem: after deployment, only the data from the first receive is inserted into its table; the second receive does not run, showing as Pending and as an asynchronous callback in the console.
I configured all the adapters correctly. Can anyone help me get the second receive to insert data into the second table?
Regards,
jay

Thank you both for your replies.
I am doing this in 11g; there is no problem with the transform activity.
My requirement is: two different files come from two different folders on a drive. We can't use one file adapter because the files have different columns (only a few are common), so we use two different XSDs. Therefore I am using two file adapters, and the data coming from them must go into two different tables. I am using one DB adapter for both inserts because the tables are in the same database, with relationships, and I used a BPEL (Define Service Later) component.
Now please suggest the flow in BPEL to insert both files into their respective tables.
The flow I built: 1st file adapter ---> receive ---> transform ---> invoke ---> DB adapter, and then the same again, with the 2nd receive placed below the 1st invoke:
2nd file adapter ---> 2nd receive ---> 2nd invoke ---> same DB adapter
My problem is that only the data from the first branch is inserted; the second one does not run, as described earlier. I used the Read File option, unchecked the Delete Files option, and set different polling frequencies for the two file adapters.
I tried to set up a correlation but it did not work. I later set non-blocking invoke to TRUE on the DB adapter, which also didn't work. I also tried the transaction property on the BPEL component:
<property name="bpel.config.transaction" many="false" type="xs:string">required / requiresNew</property>
But no change.
Regards,
jay
    Edited by: 910162 on Apr 5, 2012 12:38 AM

  • ADF master-detail data table

Hi,
I have an ADF master-detail data table. I put an af:selectBooleanCheckbox with autoSubmit="true" in the detail data table. When I click the detail table, the valueChangeListener of the af:selectBooleanCheckbox runs, and I don't want that. Is it possible to prevent this?
regards,

Hi
I can't remove autoSubmit="true" because I handle the valueChangeEvent in a managed bean. Why does a master table event fire the valueChangeEvent in the detail table? I check the submit source in the managed bean, but that doesn't prevent the valueChangeEvent from firing. Is there any way other than dropping autoSubmit="true"?
thx,

  • User Details Data Source

Hello all,
I'm working on configuring the user search data source and the user details data source in our GRC AC environment. Below is my doubt:
Can I configure GRC AC to automatically fill the Manager field on the access request screen? Obviously the user details data source must be configured. Is it possible using SU01? HR? LDAP? All of them? Some examples would be really appreciated.
In other words: when an access request is made, I want all user details filled in automatically, including the Manager.
Regards,
SAP Legend

Hi,
Yes, you can configure the manager look-up functionality by configuring the detail data source in the IMG; make sure you do all the configuration relevant to whichever data source you are using.
If you are using LDAP, make sure you have mapped your AC field names to the target system field names and done all the other LDAP-related configuration.
If you are using an HR system as the data source, please check the link below.
Configure Manager Look-Up in ARM for GRC 10
Regards,
Neeraj

  • Switchover in a data guard environment using Grid Control 10.2.0.3

I've tested switchover in a Data Guard environment using the Data Guard Broker in Grid Control.
However, at times I receive the message "RemoteOperationException: failed to establish input streaming thread". It looks like it has problems connecting to the remote node using the host credentials supplied. I know the credentials are OK because this worked before; testing the preferred credentials succeeds too.
The workaround has been to restart the database, which seems to work.
Has anyone experienced this?

Thanks to all for replying.
1. How can I upgrade my Grid Control 10.2.0.3 to 10.2.0.4?
I have upgraded the Grid Control Agent and OMS from 10.2.0.3 to 10.2.0.4; upgrading the OMS also upgrades the repository database.
Apply the patch set p3731593_10204_Linux-x86-64.zip (which comes with Grid_Control_10.2.0.4.0_Linux-x86-64.zip):
su - oracle
cd 3731593/Disk1
./runInstaller (run twice: once for the Agent and once for the OMS upgrade)
Agent upgrade:
During the upgrade, on "Specify Home Details" select the Agent home to upgrade the Grid Control Agent.
Ex: /u01/app/oracle/OracleHomes/agent10g
OMS upgrade:
On "Specify Home Details" select the OMS home to upgrade the OMS.
Ex: /u01/app/oracle/OracleHomes/oms10g
2. How can I monitor/connect my existing database 10.2.0.4 from Grid Control 10.2.0.3?
Install the Grid Agent on the existing 10.2.0.4 database server.
Download Linux_x86_64_Grid_Control_agent_download_10_2_0_4_0.zip from OTN: http://www.oracle.com/technology/software/products/oem/htdocs/agentsoft.html
su - oracle
cd /u01/software/GridAgent/linux_x64/agent
./runInstaller
Refer: To Install an Additional Management Agent Using OUI - http://download.oracle.com/docs/cd/B16240_01/doc/install.102/e10953/installing_em.htm#sthref318
(Oracle® Enterprise Manager Grid Control Installation and Basic Configuration, 10g Release 4 (10.2.0.4), Part Number E10953-05, Chapter 3 "Installing Enterprise Manager")
Thanks
Mukarram Khan
Edited by: Mukarram Khan on Feb 6, 2009 11:04 PM

  • User Details Data Source in CUP 5.3

Dear GRC Gurus,
I am configuring CUP 5.3. In the user data source (which is used to fetch users, approvers, and managers from the backend) there is a User Details Data Source; I select SAP and get the system name. There is also a Function Template field with two options, Standard and Custom.
What is the use of the Function Template?
What are Standard and Custom?
If we select Custom, what should we enter as the Function Template Name?
Can you please clarify?
Thanks a lot.
Regards
Selva

Hi,
The user data source only reads the user details to default the information into request forms / workflow.
I believe the function template just tells the system whether to use the standard fields of the SAP user master or whether you have requirements for alternative field mappings.
I don't think the custom template name itself matters, as long as it can be identified.
I must admit that I haven't used it, so I may be wrong, but that is my current understanding!
Regards, Simon
    Edited by: Simon P Persin on Oct 26, 2009 4:40 PM

  • Problem in data coming to PI system from SAP?

Hello All,
I am facing a problem with the data coming from the SAP system to the PI system.
In VT02N I enter the shipment number, create something, and an IDoc is triggered.
When the data arrives in the PI system and I check it in moni, I see that the data is getting mixed up.
For example, the IDoc data in WE19 is:
data1 --- abcdef
data2 --- 12345.00
When I check the IDoc data in PI moni:
<data1>abcdef   1</data1>
<data2>2345.00</data2>
How is data from the second tag ending up in the first tag? Where can this happen?
If anyone has faced this kind of problem, please let me know.
Thanks and Regards,
Chinna

Problem solved: I had not imported the latest metadata into PI via IDX2. Now that I have done so, it works fine.

  • Compare dates coming from source system & update higher one in target system

Hi all,
My requirement is to compare dates coming from the source system and update the highest one.
Example: E1EDP01 is repeated 3 times within the IDoc segment E1EDP20; the highest date needs to be updated in the target system.
Like: 14/12/08
      15/12/08
      16/12/08
Here 16/12/08 needs to be updated first and then the other ones one by one.
Can anybody guide me on the functionality for comparing these dates?
Send me the code!
Regards
Chaithanya

Hi Michael,
I was unable to trace exactly how to track the E1EDP01 dates and compare them.
I created a UDF to compare the dates coming from the E1EDP01 segment; following is my code:
// This UDF returns "1" for the highest date and "0" for every other date.
DateFormat mydateformat = new SimpleDateFormat("ddMMyyyy");
Date mydate1 = null;
Date highestDt = null;

// First pass: find the highest date in the queue,
// skipping context-change markers.
for (int i = 0; i < a.length; i++) {
    if (a[i].equals(ResultList.CC)) continue;
    try {
        mydate1 = mydateformat.parse(a[i]);
        if (highestDt == null || highestDt.before(mydate1))
            highestDt = mydate1;
    } catch (Exception e) {}
}

// Second pass: emit "1" for the highest date, "0" otherwise.
for (int i = 0; i < a.length; i++) {
    if (a[i].equals(ResultList.CC)) continue;
    try {
        mydate1 = mydateformat.parse(a[i]);
        if (highestDt.equals(mydate1))
            result.addValue("1");
        else
            result.addValue("0");
    } catch (Exception e) {}
}
My problem here is that my IDoc has multiple E1EDP20 segments, and the date is repeated 4 times. I have to find the highest of the 4 dates and send it to the target system, which I have done. But the segment E1EDP01 is repeated, and I am unable to find the highest date individually at node level.
Sorry, I should have explained this before.
Can anyone guide me on comparing the dates at segment level?
Regards
Chaithanya

  • Unable to view the details data  in "Employee Self-Service 4.0" of HRMS

I am unable to view the detail data in "Summary of Absences" when clicking the "Leave of Absence" button in "Employee Self-Service 4.0".
The whole login path is: "Employee Self-Service 4.0" ===> "Leave of Absence".
The "Summary of Absences" should be visible because there is data in the database, but when I click it, it shows "No data exists."

Dear user11977612:
Thank you for your answer. I have checked the profile "HR: Self Service HR Licensed"; it is already set to YES. Thank you! But there are still no records.
Dear Naveen:
The page opens with IE7; it just does not show the detail records (like the detail under "Absence Type" ===> Advance Leave) on the page.
Regards,
Edited by: user11975899 on 2009/10/13 7:17 PM

  • Regarding generic data source

Hi guys,
I have some doubts regarding generic data sources, namely the following questions:
1) What is an ALE pointer (ALE delta)? In which scenario do we use it? Can you explain with a real-time scenario so that I can understand easily?
2) What is an SLA? Explain with a scenario.
3) When do we use the t-codes RSMO and SM50 with a process chain?

1. ALE is Application Link Enabling.
It is the set of tools, programs, and data definitions that provides the mechanism for distributing SAP functionality and data across multiple systems. ALE enables the construction and operation of distributed applications.
Its purpose was to overcome the limitations of a single SAP system: a single system running on top of one database often does not fulfill the needs of larger corporations, from either a business or a technical perspective. ALE allows the implementation of loosely coupled SAP systems, where each system has its own database and is essentially independent from the others, and lets us distribute data between different systems and business processes.
ALE enables you to transfer data (master and transactional) from an SAP system to another SAP or non-SAP system. This is done using IDocs.
ALE can be divided into three processes (an outbound sketch follows the list):
Output process: extracting data from the database and putting it into the IDoc.
Communication process: transferring the IDoc to the target system.
Inbound process: posting the IDoc data into the tables of the receiver system.
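On the outbound side, the classic function module for handing a master IDoc over to ALE is MASTER_IDOC_DISTRIBUTE. A minimal sketch (the message type, IDoc type, receiver, and segment below are made-up placeholders):

DATA: ls_control TYPE edidc,
      lt_comm    TYPE STANDARD TABLE OF edidc,
      lt_data    TYPE STANDARD TABLE OF edidd,
      ls_data    TYPE edidd.

" Control record: who sends what to whom (placeholder values).
ls_control-mestyp = 'ZMY_MESTYP'.    " message type (hypothetical)
ls_control-idoctp = 'ZMY_IDOC01'.    " basic IDoc type (hypothetical)
ls_control-rcvprt = 'LS'.            " receiver partner type: logical system
ls_control-rcvprn = 'RCVCLNT100'.    " receiver logical system (hypothetical)

" One data record: segment name plus its flat field data.
ls_data-segnam = 'ZSEG1'.            " segment name (hypothetical)
ls_data-sdata  = 'flat segment data'.
APPEND ls_data TO lt_data.

CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
  EXPORTING
    master_idoc_control            = ls_control
  TABLES
    communication_idoc_control     = lt_comm
    master_idoc_data               = lt_data
  EXCEPTIONS
    error_in_idoc_control          = 1
    error_writing_idoc_status      = 2
    error_in_idoc_data             = 3
    sending_logical_system_unknown = 4
    OTHERS                         = 5.
COMMIT WORK.  " the IDoc is actually dispatched on commit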
2. SLA means service level agreement. It is an agreement between the client and the service provider regarding service requests and their resolution, e.g. a P1 issue must be solved in 1 hour, a P2 issue in 3 hours; the exact terms vary from client to client.
3. RSMO is used to monitor data loads in BW; you have several options to restrict the criteria for the loads you monitor.
In SM37, if you go to the job and double-click it, you reach a screen with details for the job. If you then press the "Job details" button, the resulting screen shows a PID and the executing server. With this combination, you can check the job in SM50.
Assign points if helpful.
Thanks
Tripple k
