HELP: Need to do version comparison for a large number of programs

I need to do a version comparison for a large set of programs (approx. 4000). Does anybody have a technique to do it quickly? Please let me know.

Hi,
Try using one of these function modules:
/SDF/CMO_COMP_VERSION
AKB_VERSION_COMPARE
Regards
Shiva
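If the sources from both systems can be pulled down to files, a generic batch diff is another way to handle ~4000 programs. A minimal sketch in Python (the directory layout and the ".abap" file naming are illustrative assumptions, not anything from this thread):

```python
# Sketch: batch-compare ~4000 program sources exported from two systems.
# Assumes each program was saved as "<program>.abap" in two local directories.
import filecmp
from pathlib import Path

def compare_sources(dir_a: str, dir_b: str):
    """Return (identical, changed, only_in_a, only_in_b) sets of file names."""
    names_a = {p.name for p in Path(dir_a).glob("*.abap")}
    names_b = {p.name for p in Path(dir_b).glob("*.abap")}
    common = sorted(names_a & names_b)
    # shallow=False forces a byte-by-byte comparison, not just os.stat() data
    same, diff, _errors = filecmp.cmpfiles(dir_a, dir_b, common, shallow=False)
    return set(same), set(diff), names_a - names_b, names_b - names_a
```

The "changed" set is the short list you would then inspect one by one in version management, instead of opening all 4000.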

Similar Messages

  • MORE HELP: VERSION COMPARISON OF LARGE NO. OF PROGRAMS

    I need to do version management of a large number of programs and need some help with it.
    PS: Shiva, your function module doesn't run in any of my systems; tell me which system it runs in.

    Hi,
    You can go through transaction SE09; after that, find your request numbers and release them.
    You can also first check whether the function module's source code exists in that system.
    Thanks
    shankar

  • Version comparison for BW objects

    Hello guys,
    How do I see versions of BW objects like InfoCubes, MultiProviders, and DataSources? I want to do version comparisons for a few objects in Dev with respect to Production. How do I make sure that both have the same versions and that there is no harm in transporting from Dev to Prod?
    In case we already made the changes in Dev before checking the comparison, what do we do?
    Please give me the complete process of version comparison and where exactly I can see the versions.
    Please reply ASAP.
    Thanks in advance.
    Regards,
    BMW 325ci

    Unfortunately, there is no tool delivered with SAP BW that provides the ability to compare objects across the landscape. There are some third-party tools that provide this functionality.
    You could create your own by writing an ABAP program, or several ABAP programs, that use the following BAPI function modules to get the details of the objects in each environment and then compare them:
    InfoCube - BAPI_CUBE_GETDETAIL
    InfoObject Catalog - BAPI_IOBC_GETDETAIL
    InfoObject - BAPI_IOBJ_GETDETAIL
    InfoPackage - BAPI_IPAK_GETDETAIL
    InfoSet - BAPI_ISET_GETDETAIL
    DSO - BAPI_ODSO_GETDETAIL
    MultiProvider - BAPI_MPRO_GETDETAIL
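    Once the details have been fetched from each system, the comparison itself can be a simple field-by-field diff. A minimal sketch in Python (the flattened-dictionary shape and the field names are illustrative assumptions; the BAPIs above return structured data that would first need flattening):

```python
def diff_details(dev: dict, prod: dict) -> dict:
    """Return {field: (dev_value, prod_value)} for every field that differs."""
    fields = set(dev) | set(prod)
    return {f: (dev.get(f), prod.get(f))
            for f in sorted(fields)
            if dev.get(f) != prod.get(f)}

# Example with made-up InfoCube attributes:
dev_cube  = {"INFOCUBE": "ZSALES", "INFOAREA": "ZAREA1", "CUBETYPE": "B"}
prod_cube = {"INFOCUBE": "ZSALES", "INFOAREA": "ZAREA2"}
```

    An empty result means the two environments agree on every field you extracted; anything else is what you would review before transporting.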

  • Help needed in Finding Download location for Sun One Portal 7

    Hi,
    I need help finding the download location for Sun ONE Portal 7. I tried the Oracle download page,
    http://www.oracle.com/us/sun/sun-products-map-075562.html, but was unable to find it.
    Please share the link for download location.
    I am totally new in Sun ONE Portal.
    Thanks,
    Edited by: 945439 on Oct 5, 2012 3:41 AM

    Try edelivery.oracle.com under Sun products.

  • Help, error: the 1080p version is too large to play

    I recently purchased The Avengers (HD) from the iTunes Store and keep getting an error saying the 1080p version is too large to play on this computer, only allowing the smaller version to be played.
    My current system stats: windows 7 64 bit, intel core i7-3770k, 16GB ram, 2x GTX 680 sli mode, 128gb ssd, 2tb HDD and a samsung 27" 3d monitor.
    Any help would be appreciated

    Bump... anyone else have this? I am getting this and I'm not sure why. Is it iTunes and playing files > 4GB?

  • Help needed on online theme creation for mobile phones

    Hello everybody,
    I want to create a web application that will create themes for different mobile phones. In that application, end users can upload JPG/GIF images of their choice and select the mobile phone make, like Nokia, and also the model number, like 6030. After that they can create their desired theme by clicking a button and can also download it.
    My main problem is how to convert an image into a mobile phone theme (*.thm or *.nth or *.sis).
    Can anybody give any suggestion on this matter?
    Thanks in advance.
    Tanmoy

    Hi everybody,
    My main problem is how to convert an image into a mobile phone theme (*.thm or *.nth or *.sis).
    Please give me any guideline that I can proceed.
    Help needed.
    Thanks in advance.
    Tanmoy

  • Help needed - Installing EPPM 8.2 for existing PMDB version 8.2

    Hi
    Our users are currently accessing Primavera using the P6 Professional client version 8.2 against a PMDB version 8.2 without any issues.
    Now I have been assigned the task of installing web access for the same. For this, I restored the DB to a test machine and installed P6 EPPM v8.2 on it. When I try to connect to the existing PMDB, the configuration tool says it is not an EPPM database. The same happens when I try to upgrade the DB. The only option left is to create a new DB.
    Can someone please guide me on how to do this properly?
    Thanks in advance
    JeVee
    Edited by: user13673523 on May 8, 2012 9:44 PM

    Hello,
    1) I am going to try the export/import option using the P6 Professional client. Could you please let me know whether I will be able to copy all the data from the existing DB to the new EPPM DB using this method?
    Everything related to the projects, but not related to the application; you can't export layouts, users, privileges, user preferences, etc. in the XER files.
    2) I followed the manual upgrade procedure and then installed P6 once again. This time, the configuration tool accepted the DB and completed the deployment.
    But nothing seems to work except the WebLogic admin application (http://server:7001/console). All other URLs (P6, PR, etc.) are showing 'page cannot be displayed', even though the status of all the applications is 'Running' in the WebLogic admin.
    Well, in my experience it is not possible to upgrade from PPM to EPPM because the database schemas are different; I don't know how you did it. Your issue with the web applications seems to be a separate issue, not related to the database: the 'page cannot be displayed' error relates to the web applications, not to your database. I believe you are trying to say that the managed servers are running (server tab) and the applications should be in 'active' status, right?
    3) Do you think I can migrate the entire DB to version 8.2 following the above guide, instead of doing the export/import?
    That guide is for upgrading from an EPPM database to an EPPM database. I'm almost sure that upgrading from PPM to EPPM will give you more issues.
    You can check whether your database is OK by running the validate.bat file (in the media folder), and you can also test your database using the EPPM P6 Windows client instead of P6 web access, for test purposes only.
    Best Regards

  • Help Needed Configuring Post Set Up For Big Indie Feature:

    We’ve got incredible performances from an amazing cast of well-known character actors from film, TV and stage, a unique and inspiring script, and some truly beautiful, cinematic footage. And now — it’s all about putting it together…
    But I’m having trouble finding an accurate, effective and (most importantly) DETAILED workflow accommodating the latest version of Premiere, to post a LENGTHY feature film shot on the RED Epic in 5K FF. I’ve reviewed videos here, but in conducting edit tests I'm encountering all manner of glitches and problems, and I need to ensure that I am properly configuring our post workflow and all of the associated hardware.
    I’ll be dividing the film into project file “reels” to help keep things manageable.  As editor and DP, it's also most important for me to edit on a 4k timeline (to take advantage of reframing and stabilization of the 5k). Footage is on three Pegasus II RAIDS.
    There will be a LOT of FX work done outside of Premiere in AE (and other compositing and graphics programs).  I’m planning to finish in Resolve, outputting at 4k.
    I have the new Mac Pro with:
    - 2.7GHz 12-core Intel Xeon E5 with 30MB of L3 cache
    - 64GB (4x16GB) of 1866MHz DDR3 ECC - 4X16GB
    - 1TB PCIe-based flash storage
    - Dual AMD FirePro D700 GPUs with 6GB of GDDR5 VRAM each
    A Sonnet Echo Express III Desktop 3-Slot (with Thunderbolt upgrade) housing the following:
    - Red Rocket
    - Connection card for HP LTO5 Ultrium 3000 Sas Ext Tape Drive
    If needed, I’m open to acquiring other cards or hardware, too (possibly adding the Red Rocket-X to the mix).  And if the referral of a PAID individual with firsthand knowledge of such a setup – to at least help with the initial set up and configuration – is what I need, I’m open to that, as well.
    We truly have an amazingly powerful compilation of principal photography, and a great story to tell — I just need to overcome this major hurdle of setting up our post.  As a somewhat newbie to Premiere, I greatly appreciate any advice, pointing-me-in-the-right-direction, or suggestions of individuals to help oversee this for compensation and screen credit that anyone can offer me here...
    Many Thanks,
    Bill

    Oooh...big involved questions and many of them.
    If it helps..
    A few things I know I would do before even starting on the actual film edit
    ..is give the new system a massive shakedown.
    Thrash the hardware  and software with test footage , graphics and audio etc.
    Test the pipelines and workflows.
    Set up a BACKUP routine. DO NOT RELY ON AUTOSAVES.
    Would NOT do OS updates once started with a stable system.
    Me... I would not use a Dynamic Link workflow for FX. (I would use D.I.s)
    Would create a flow-chart plan for the edit and post production to avoid generation losses and to plan for efficiency and scheduling.
    Work toward  lock downs before FX , audio, CC and Grade. (Avoid the trap of being creatively impatient for the benefit of a smooth edit experience)
    Sort this first...
    I'm encountering all manner of glitches and problems, and need to ensure that I am properly configuring our post workflow and all of the associated hardware.
    You might want to specify some of this stuff and see if you get answers here or elsewhere..
    Suggestion - can you wait for the next version of PPro? - coming very soon, evidently. Start with the latest version so you don't need to update midstream. A feature takes a long time to post, and some cool new features may help, e.g. the enhanced Master Clip.
    Have you considered Prelude in the workflow to log and set up your project? (Never used it myself, but I would consider/investigate it for long form.)
    Good luck and enjoy the edit process.

  • Help needed with a PS script for network share documentation

    I found a nice PS script that will do what I want; however, the output portion seems to be broken. It will output the permissions and details, but not list which share it is referring to... Can anyone help with this?
    Thanks!
    https://gallery.technet.microsoft.com/scriptcenter/List-Share-Permissions-83f8c419#content
    <# 
               .SYNOPSIS  
               This script will list all shares on a computer, and list all the share permissions for each share. 
               .DESCRIPTION 
               The script will take a list all shares on a local or remote computer. 
               .PARAMETER Computer 
               Specifies the computer or array of computers to process 
               .INPUTS 
               Get-SharePermissions accepts pipeline of computer name(s) 
               .OUTPUTS 
               Produces an array object for each share found. 
               .EXAMPLE 
               C:\PS> .\Get-SharePermissions # Operates against local computer. 
               .EXAMPLE 
               C:\PS> 'computerName' | .\Get-SharePermissions 
               .EXAMPLE 
               C:\PS> Get-Content 'computerlist.txt' | .\Get-SharePermissions | Out-File 'SharePermissions.txt' 
               .EXAMPLE 
               Get-Help .\Get-SharePermissions -Full 
    #> 
    # Written by BigTeddy November 15, 2011 
    # Last updated 9 September 2012  
    # Ver. 2.0  
    # Thanks to Michal Gajda for input with the ACE handling. 
    [cmdletbinding()] 
    param([Parameter(ValueFromPipeline=$True, 
        ValueFromPipelineByPropertyName=$True)]$Computer = '.')  
    $shares = gwmi -Class win32_share -ComputerName $computer | select -ExpandProperty Name  
    foreach ($share in $shares) {  
        $acl = $null  
        Write-Host $share -ForegroundColor Green  
        Write-Host $('-' * $share.Length) -ForegroundColor Green  
        $objShareSec = Get-WMIObject -Class Win32_LogicalShareSecuritySetting -Filter "name='$Share'"  -ComputerName $computer 
        try {  
            $SD = $objShareSec.GetSecurityDescriptor().Descriptor    
            foreach($ace in $SD.DACL){   
                $UserName = $ace.Trustee.Name      
                If ($ace.Trustee.Domain -ne $Null) {$UserName = "$($ace.Trustee.Domain)\$UserName"}    
                If ($ace.Trustee.Name -eq $Null) {$UserName = $ace.Trustee.SIDString }      
                [Array]$ACL += New-Object Security.AccessControl.FileSystemAccessRule($UserName, $ace.AccessMask, $ace.AceType)  
                } #end foreach ACE            
            } # end try  
        catch  
            { Write-Host "Unable to obtain permissions for $share" }  
        $ACL  
        Write-Host $('=' * 50)  
        } # end foreach $share
    This is what the output looks like when run with 'RemoteServer' | .\Get-SharePermissions.ps1 | Out-File 'sharepermissions.xls'
    FileSystemRights  : Modify, Synchronize
    AccessControlType : Allow
    IdentityReference : Everyone
    IsInherited       : False
    InheritanceFlags  : None
    PropagationFlags  : None

    Actually, not everything is written with Write-Host. The last line of the loop is "$ACL", which is an array of objects.
    Here is a version that gets the info more easily and produces flexible objects. It should be easier to modify into what is needed.
    # Get-ShareSec.ps1
    [cmdletbinding()]
    param(
        [Alias('ComputerName')]
        [Parameter(ValueFromPipelineByPropertyName=$True)]
        $Name = $env:COMPUTERNAME
    )
    Process {
        Write-Verbose "Computer=$name"
        # Type=0 restricts the query to disk shares only
        $shares = Get-WmiObject Win32_Share -ComputerName $name -Filter 'Type=0' -ea 0
        foreach ($share in $shares) {
            $sharename = $share.Name
            Write-Verbose "`tShareName=$sharename"
            $ShareSec = Get-WMIObject -Class Win32_LogicalShareSecuritySetting -Filter "name='$ShareName'" -ComputerName $name
            try {
                foreach ($ace in $ShareSec.GetSecurityDescriptor().Descriptor.DACL) {
                    # Emit one object per ACE so the share name travels with each row
                    $props = [ordered]@{
                        ComputerName  = $name
                        ShareName     = $sharename
                        TrusteeName   = $ace.Trustee.Name
                        TrusteeDomain = $ace.Trustee.Domain
                        TrusteeSID    = $ace.Trustee.SIDString
                    }
                    New-Object PsObject -Property $props
                }
            }
            catch {
                Write-Warning ('{0} | {1} | {2}' -f $name, $sharename, $_)
            }
        }
    }
    # Example: run against every computer in AD
    Get-ADComputer -Filter * | .\Get-ShareSec.ps1 -v
    ¯\_(ツ)_/¯

  • Help needed: How can I import a large audio collection with many playlists? iTunes crashes because so many playlists try to load and play at once.

    I've got a problem that I can't seem to figure out after reading scads of articles and posts. I'd be grateful for any help or ideas you can provide. Here's what's going on.
    I have a folder of GraphicAudio audio books. It contains roughly 500 subfolders, one for each book. In each folder are the mp3 files for the book and an m3u file that was supplied with the books at the time of purchase. None of this content is currently in my iTunes library.
    When I try to add this folder to my iTunes library, iTunes starts trying to play each playlist it encounters. It crashes after around 30 seconds. I suspect this is because it tries to switch and play playlists so rapidly.
    I don't actually want iTunes to start playing each playlist it finds. I just want to import my files. Once they're imported, I can select them and send individual books to my iPod Touch.
    Do any of you know of a work-around for this? I'm dreading the thought of having to import each subfolder one by one. I keep thinking that if I could just prevent iTunes from automatically trying to play the playlists, it would import the mp3s and playlist info without a problem. If there's no way to do this in iTunes, might there be a third party app that can manage it somehow?
    For reference, I'm using the latest version of 64 bit iTunes under 64 bit Windows 7.
    Thanks for taking the time to help me.

    I've written a script called DeDuper which can help remove unwanted duplicates. See this  thread for background.
    That said, you should probably start by "undoing" the creation of the extra mp3s. If you sort the song list on the Date Added column, you can select the first song from the relevant batch, then scroll to the end of the group, hold Shift, and click to select the range. You can then delete and send to the Recycle Bin.
    My script won't be as effective against this group. The "Convert to format" tool may well specify a different bit rate from the source file, but it won't ever increase the quality of the audio. Clearly, converting 128k mp3 to 256k mp3 would have no benefit, so you want the original files.
    If you still have duplicates use the script to thin out what remains.
    tt2
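    The poster's own idea - keeping iTunes from seeing the .m3u playlists during import - can be sketched generically: move the playlist files aside, import the folder, then move them back. A hedged Python sketch (the stash-and-restore approach and the flattened naming are illustrative, not an iTunes feature):

```python
# Sketch: temporarily move all .m3u playlist files out of a library folder
# so a bulk import only sees the mp3s. Restore by reversing the move.
import shutil
from pathlib import Path

def stash_files(root: str, stash: str, pattern: str = "*.m3u") -> int:
    """Move every file matching pattern out of root (recursively) into stash.

    Each file is renamed with its relative path flattened ("Book1__book.m3u")
    so it can be identified and restored later. Returns the number moved.
    """
    root_p, stash_p = Path(root), Path(stash)
    stash_p.mkdir(parents=True, exist_ok=True)
    moved = 0
    for f in root_p.rglob(pattern):
        target = stash_p / f.relative_to(root_p).as_posix().replace("/", "__")
        shutil.move(str(f), str(target))
        moved += 1
    return moved
```

    After the import finishes, the same mapping can be walked in reverse to put the playlists back where they came from.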

  • Help needed regarding the deployment architecture for PROD env

    Dear All,
    Please help me with some clarifications regarding the deployment architecture for PROD env.
    As of now I have 2 single node 12.1.1 installations for DEV and CRP/TEST respectively.
    Shortly I will be having a PROD env of 12.1.1 with one DB node and 2 middle tier (apps) node. I need help in whether -
    1) whether to have a shared APPL_TOP on the SAN for the 2 apps nodes or to have separate APPL_TOPs for the 2 apps nodes. The point is which will be more beneficial in my case from a business point of view. The INST_TOPs will be node-specific in any case, right?
    2) Where to enable the Concurrent Managers for better performance: on the DB node, on the primary apps node, or on both apps nodes?
    12.1.1 is installed on RHEL 5.3
    Thanks and Regards

    Hi,
    Please refer to (Note: 384248.1 - Sharing The Application Tier File System in Oracle E-Business Suite Release 12).
    For enabling the CM, it depends on what resources you have on each server. I would recommend you install it on the application tier node, and leave the database on one server with no application services (if possible).
    Regards,
    Hussein

  • Help needed to record an experiment for a running process

    Hi Team,
    While trying to record an experiment through the Profile Running Process option, we had problems generating an experiment file. The error stated that the directory wasn't writable, but we made sure all the permissions are available on the folder. The application is a C++ implementation.
    Have attached the output message:
    Running: /x/opt/SolarisStudio12.4-beta_mar14-linux-x86/lib/analyzer/lib/../../../bin/collect -P 16824 -o test.1.er -d /x/web/STAGE2LP14/xxxx -p on -S on
    name test. is in use; changed to test.4.er
    Reading xxxx
    Reading ld-linux.so.2
    name test. is in use; changed to test.4.er
    Reading libppfaketime.so.1
    Reading librt.so.1
    Reading libpthread.so.0
    Reading libcrypt.so.1
    Reading libz.so.1
    Reading libdl.so.2
    Reading libkrb5.so.3
    Reading libicui18n.so.36
    Reading libicuuc.so.36
    Reading libicudata.so.36
    Reading libicuio.so.36
    Reading libexpat.so.0
    Reading libqpidmessaging.so.3
    Reading libqpidtypes.so.1
    Reading libxerces-c.so.27
    Reading libstdc++.so.6
    Reading libm.so.6
    Reading libc.so.6
    Reading libgcc_s.so.1
    Reading libk5crypto.so.3
    Reading libcom_err.so.2
    Reading libkrb5support.so.0
    Reading libkeyutils.so.1
    Reading libresolv.so.2
    Reading libqpidclient.so.6
    Reading libuuid.so.1
    Reading libselinux.so.1
    Reading libqpidcommon.so.6
    Reading libsepol.so.1
    Reading libboost_program_options.so.2
    Reading libboost_filesystem.so.2
    Reading libsasl2.so.2
    Reading ISO8859-1.so
    Reading libcollector.so
    Attached to process 16824
    t@4133656384 (l@16824) stopped in __kernel_vsyscall at 0xffffe410
    0xffffe410: __kernel_vsyscall+0x0010:    popl     %ebp
    Process ID: 12981
    dbx: The HW counter configuration could not be loaded
    Elapsed Time: 85 ms
    Run "collect -h" or "er_kernel -h" with no other arguments for more information on HW counters on this system.
    Execution completed, exit status is 0
    dbx: Creating experiment database /x/web/STAGE2LP14/xxxxxx/test.4.er (Process ID: 13736) ...dbx: Creating experiment database /x/web/STAGE2LP14/xxxxxx/test.4.er (Process ID: 13736) ...
    dbx: Experiment directory not writable
    Experiment aborted
    error at line 16 of file 'dbxcol3wC1XU'
    detaching from process 16824
    We even tried the collect command manually; the process started successfully, but while terminating the process using CTRL+ENTER we got a coredump error:
    f7f40000-f7f50000 rwxp f7f40000 00:00 0
    f7f50000-f7f6b000 r-xp 00000000 fd:00 589838                             /lib/ld-2.5.so
    f7f6b000-f7f6c000 r-xp 0001a000 fd:00 589838                             /lib/ld-2.5.so
    f7f6c000-f7f6d000 rwxp 0001b000 fd:00 589838                             /lib/ld-2.5.so
    ffbe7000-ffbfc000 rwxp 7ffffffe9000 00:00 0                              [stack]
    ffffe000-fffff000 r-xp ffffe000 00:00 0
    dbx: internal error: signal SIGABRT (sent by tkill)
    dbx's coredump will appear in /tmp
    We aren't sure how to terminate the collect process manually.
    /x/opt/SolarisStudio12.4-beta_mar14-linux-x86/lib/analyzer/lib/../../../bin/collect -P 16824 -o test.1.er -d /x/web/STAGE2LP14/xxxx -p on -S on
    Please help us
    Thanks
    Sattish.

    Hi Darryl,
    We tried with the below mentioned option
    ./collect -P 24829 -o /tmp/test.9.er, but still got the same error:
    NOTE: No J2SE[tm] was specified for checking.
        The following J2SE[tm] versions are recommended:
          J2SE[tm] 1.7.0_25 or later 1.7.0 updates (preferred)
    NOTE: You can download and install the J2SE[tm] from http://www.oracle.com/technetwork/java/javase/downloads.
    WARNING: Java data collection may fail: J2SE[tm] version is unsupported.
    Reading atlasserv
    Reading ld-linux.so.2
    Reading libppfaketime.so.1
    Reading librt.so.1
    Reading libpthread.so.0
    Reading libcrypt.so.1
    Reading libz.so.1
    Reading libdl.so.2
    Reading libkrb5.so.3
    Reading libicui18n.so.36
    Reading libicuuc.so.36
    Reading libicudata.so.36
    Reading libicuio.so.36
    Reading libexpat.so.0
    Reading libqpidmessaging.so.3
    Reading libqpidtypes.so.1
    Reading libxerces-c.so.27
    Reading libstdc++.so.6
    Reading libm.so.6
    Reading libc.so.6
    Reading libgcc_s.so.1
    Reading libk5crypto.so.3
    Reading libcom_err.so.2
    Reading libkrb5support.so.0
    Reading libkeyutils.so.1
    Reading libresolv.so.2
    Reading libqpidclient.so.6
    Reading libuuid.so.1
    Reading libselinux.so.1
    Reading libqpidcommon.so.6
    Reading libsepol.so.1
    Reading libboost_program_options.so.2
    Reading libboost_filesystem.so.2
    Reading libsasl2.so.2
    Reading ISO8859-1.so
    Reading libcollector.so
    Attached to process 24829
    t@4133668672 (l@24829) stopped in __kernel_vsyscall at 0xffffe410
    0xffffe410: __kernel_vsyscall+0x0010:   popl     %ebp
    dbx: The HW counter configuration could not be loaded
    Run "collect -h" or "er_kernel -h" with no other arguments for more information on HW counters on this system.
    dbx: Creating experiment database /tmp/test.9.er (Process ID: 7769) ...
    dbx: Experiment directory not writable
    Experiment aborted
    error at line 15 of file 'dbxcol61PZeE'
    detaching from process 24829
    Could you please review
    Thanks
    Sattish.

  • Help needed with Mail (Version 3.6 (936)) and iCloud

    Hello, I've got problems synchronizing my emails with Mail (Version 3.6 (936)) and iCloud. I started to synchronize my iPhone 5 and the iMac at work (OS X 10.8). On my Power Mac at home I only receive older mails (the latest is from 04/25 this year), and sometimes the same mail 7 times. When I try to quit Mail it doesn't work and I have to use the shortcut: +alt+shift+esc.
    I think that my Mail version doesn't work together with iCloud, but perhaps there's a possibility to fix this problem with a trick.
    Thanks for your help
    Best regards
    Stefan

    To use iCloud you have to upgrade to at least Lion (OS X 10.7), which is not possible on a PPC Mac like yours.

  • HELP NEEDED: Use a new record for a customized table as a workflow trigger

    Hi SAP Workflow Gurus,
    Good day!
    One of our requirements is to have the creation of a new record in a custom database table trigger a workflow. I read and followed the steps as indicated in this link
    http://www.****************/Tutorials/Workflow/Table/events.htm
    As a summary, here is what I did:
    1. Created a subroutine for the custom table ZPA2003 for the create event
    2. Defined this event as a trigger for the custom workflow WS90000019
    3. Maintained and activated the event linkage via SWETYPV
    Basically I tried following the instructions indicated on the link
    Note that the entries for the custom table are created via a function module.
    Now during testing of the workflow, the system was able to complete the notification message. However, when I triggered the event via creation of the entry in the custom DB table, nothing happened.
    1. Is there a way to debug/find out whether the subroutine in the table and the event itself are actually linked?
    2. I also tried running the event trace to check the activities executed but found nothing.
    Can you guys help me check on what I may have missed out?
    Regards,

    Hi All,
    Basically these are the requirements for the work schedule substitution:
    1. Employee enters the request via ESS (custom portal transaction)
    2. Upon saving of the entry (request) in a custom  table, this should trigger the workflow at the backend
    3. The Manager should then receive a work item in his Universal Worklist at the MSS side of the portal
    4. Upon clicking on the work item, a new screen will pop up showing the work schedule substitution details as well as an interface which will allow him to enter his/her usage decision
    5. Depending on the decision, the workflow must execute the necessary notifications as well as changes in the custom and PA2003 tables
    Now, we have accomplished step 1 and I am currently in the process for step 2. Now I have some queries:
    1. For the Manager to view and approve the substitution details, I used two methods, one for each process (view and approve), since our ABAPer mentioned that this cannot be done in one function module. Is there a way to simplify this step, or is it really valid that they need to be executed as two different methods?
    2. As per our ABAPer, function modules only import and export variables; they do not have the facility of, say, having labels on the details being displayed. Hence, if the details are 10001 (Employee Number), 10/20/2011 (Start Date), 10/25/2011 (End Date), the output would be 10001, 10/20/2011, 10/25/2011. Is there a way within the function module (or Dynpro interface) to show it like this: Employee Number: 10001
                               Start Date: 10/20/2011
                               End Date: 10/25/2011
    3. Speaking of Dynpro applications, do I still need to develop one to allow the Manager to view and approve/reject the request via the MSS portal upon accessing the work item via the UWL? How would the work item go about calling the Dynpro application? Or is this even possible?
    It would have been easier if the facility did not pass through workflows, since it would just be direct Web Dynpro/ABAP calls. Having to include it as a work item in the UWL puts a certain twist on it.
    Your inputs are well-appreciated.
    Regards,

  • Help, need drivers and EC update for MSI GT 740 / 1727 Gaming notebook.

    I've searched high and low and have not been able to find drivers for the GT 740 / 1727 laptop. I needed to rebuild and no longer have my driver disk. I'm installing Windows 7 x64. I was able to find and update the BIOS on the MSI site, but no EC update (mine's from 2009).
    I have not been able to install the Nvidia GTS 250M drivers either, despite trying multiple versions from the Nvidia site.
    David

    Quote from: Svet on 24-November-11, 15:16:01
    is it barebone or retail?
    It's a barebones.
    David
