SAP Help - efficient way to look things up

Hi all,
I am really surprised by how readily people post links to SAP Help when answering questions.
Is there an efficient and quick way to search SAP Help when one is looking for specific things?
I only got this far
http://help.sap.com/
Pressed the "SAP NetWeaver" link on the left side
pressed "English" documentation
Then SAP NetWeaver Library -clicked "Developers Guide"
I hope I am right up to this point. From here on, if I am looking for XI topics like message mapping or BPM, how should I proceed to find them?
Thanks
P

I only got this far
http://help.sap.com/
Pressed the "SAP NetWeaver" link on the left side
pressed "English" documentation
Then SAP NetWeaver Library -clicked "Developers Guide"
Everybody only gets that far... once you are on the above page, you need to type your search terms into the Search option so that you get the related results; it is in the top-right portion of the page.
In addition to the normal search you can also opt for the Advanced Search option to narrow your search area.
Please note that the search results will contain many additional (unrelated) hits, so you need to pick out the relevant ones.
Regards,
Abhishek.

Similar Messages

  • Efficient way of saving documents uploaded by users

    Hello Experts,
    We are looking for an efficient way of storing documents uploaded by users. Below is an explanation of our scenario in detail.
    We are working on the SAP E-Recruiting module and have designed custom Interactive Adobe Forms & Web Dynpro Components for the client. An end user (initiating manager) would fill in the form & upload file(s) in support of his statement & click on the "submit" button. As of now we are achieving the upload functionality through WDA's FileUpload ui element & dumping the file contents into a field of type RAWSTRING.
    We now want to change our approach of using the RAW_STRING field & go for a more efficient one. I have been googling around and read a bit about approaches like:
    Class CL_BDS_DOCUMENT_SET
    Class CL_FITV_GOS
    Function Module BDS_BUSINESSDOCUMENT_CREATEF
    But all these 3 approaches seem to be targeted towards some particular business object. They expect some sort of class name or other to be passed to them as input. However, ours is a completely custom requirement where the entire form's data goes into a Z-table, so none of these approaches seems to hold good. (Please correct me if I am wrong when I say that, because I haven't personally worked with any of them to date!) Awaiting your expert opinions on this matter.
    Regards,
    Uday
    Any ideas please?

    See if this sample code helps. We have DMS (Document Management System).
    * Data declarations for the DMS BAPI structures
      DATA: DOCUMENTDATA    TYPE BAPI_DOC_DRAW2,
            RETURN_DOCUBAPI TYPE BAPIRET2,
            WA_OBJ_LINK     TYPE BAPI_DOC_DRAD,
            GT_OBJ_LINK     TYPE STANDARD TABLE OF BAPI_DOC_DRAD,
            HEADDATA        TYPE BAPIMATHEAD.   "material number, assumed to be filled elsewhere

    * Document header: document type, original file and description
      MOVE: 'DRW' TO DOCUMENTDATA-DOCUMENTTYPE,
            'C:\Users\sci30\Desktop\test.doc' TO DOCUMENTDATA-DOCFILE1,
            'TEST DESCRIPTION' TO DOCUMENTDATA-DESCRIPTION.
    *  MOVE 'WR' TO DOCUMENTDATA-STATUSINTERN.
      MOVE 'WRD' TO DOCUMENTDATA-WSAPPLICATION2.   "workstation application (Word)

    * Link the new document to a business object - here the material master (MARA)
      CLEAR WA_OBJ_LINK.
      MOVE 'MARA' TO WA_OBJ_LINK-OBJECTTYPE.
      MOVE HEADDATA-MATERIAL TO WA_OBJ_LINK-OBJECTKEY.
      APPEND WA_OBJ_LINK TO GT_OBJ_LINK.

      CALL FUNCTION 'BAPI_DOCUMENT_CREATE2'
        EXPORTING
          DOCUMENTDATA = DOCUMENTDATA
        IMPORTING
          RETURN       = RETURN_DOCUBAPI
        TABLES
          OBJECTLINKS  = GT_OBJ_LINK.

    * Commit only if the BAPI did not report an error
      IF RETURN_DOCUBAPI-TYPE NE 'E' AND RETURN_DOCUBAPI-TYPE NE 'A'.
        COMMIT WORK.
      ENDIF.

  • Most efficient way to apply Paragraph Style A to Paragraph style B?

    I'm looking for the most efficient way to apply all aspects of an existing named paragraph style "A" to another named paragraph style "B". I'd especially like to find a keyboard-only way to do this.
    To my surprise, Copy Special > Paste doesn't seem to copy tab settings...(?) This leads me to suppose that other aspects of the source p'style may not be crossing the Great Paste Divide.
    Over the years I've used a variety of more or less clumsy multi-step, multi-tool "tricks" (including third-party plug-ins) to apply one named paragraph style to another, but knowing FrameMaker as I do I suspect there may be a truly efficient way to do what I want.
    Is there?
    Cheers & thanks,
    Riley

    Arnis:
    8.0p277.
    The Font family and style got Pasted -- that was immediately apparent. And based on all I know about F'Maker, I would've thought the tabs would go over.
    But when I opened the target p'style the Tabs area was completely blank.
    I'm not sure if anything else wasn't making it 'cross the Paste Divide: Once the Tabs weren't Pasted, I fell back to a different, brute-force strategy simply to get around the problem and return to work...
    Moreover, since the structured templates I inherited for this project are full of idiosyncrasies, I'll just file this one away under "The Templates Did It" and hope for better luck the next time I try the Copy Special thing...
    Cheers & thanks for your help,
    Riley

  • Efficient way of updating data to database table

    What is an efficient way of updating data in a database table?
    I have a huge amount of data in my internal table; how should I use the UPDATE statement in this case?
    1. The database table has 20 fields.
    2. One is the key field, and suppose I want to change the 20th field.
    3. I have data for only 2 fields, i.e. the 1st (key) field and the 20th field.
    I can't use an UPDATE statement in a loop, as that is not good practice (it hits the database several times).
    Does it affect all 20 fields for a particular record?

    Hi,
    Use UPDATE statement , check below description from SAP help.
    UPDATE dbtab FROM TABLE itab. or UPDATE (dbtabname) FROM TABLE itab.
    Effect
    Mass update of several lines in a database table. Here, the primary key for identifying the lines to be updated and the values to be changed are taken from the lines of the internal table itab.
    The system field SY-DBCNT contains the number of updated lines, i.e. the number of lines in the internal table itab which have key values corresponding to lines in the database table.
    Regards
    L Appana

  • Most efficient way of changing still image frame rates?

    I'm using Premiere CS5.5 and have been having issues with the audio/video going out of sync when I upload my video to Vimeo (it's not out of sync prior to uploading; it only happens after they transcode it). The Vimeo support staff say the problem is that I'm using a variable frame rate, but there isn't any option (to my knowledge) to choose a constant or variable frame rate in Premiere/Adobe Media Encoder. I think I have identified the problem, though: the primary footage that I shot is 23.976 fps, and that is what my sequence settings and output settings have been, but I also have a lot of still images that I'm using for b-roll, which I finally noticed are 29.97 fps. I changed the Intermediate Media Timebase setting to 23.976 fps in the preferences, but my question now is whether there is an efficient way to change the frame rate of all those still images without having to re-import them and re-apply all the various motion effects/key framing that I've already spent a lot of time on.
    Any help would be much appreciated. Thanks.

    Yeah, I downloaded MediaInfo to explore some of the various versions that I've rendered in my effort to get something that works correctly on Vimeo. Depending on the encoder, I saw some of what Ann posted, but I also looked at one in MediaInfo that I had rendered through Adobe Media Encoder and it had a constant frame rate mode of 23.976, yet I remember watching that one on Vimeo at the time and it was still out of sync. MediaInfo says that version has a variable audio bit rate, though, but to my knowledge there isn't an option in AME to choose a constant audio bit rate.
    I agree that it's an issue on Vimeo's side, but the chances of them getting their **** together in a timely fashion seem low. The frustrating thing is that I can watch documentary style videos on Vimeo that I know were edited and compressed through Premiere/AME, yet they don't have any audio sync issues. The difference that I noticed between one of those videos and mine is the use of still images in my video. As you can see from an image I uploaded earlier, the default frame rate for still images when I imported them was 29.97 until I changed it yesterday, but it didn't update the images I had already imported. So I'm wondering if there is a way to change the frame rate of those still images without having to re-import the images and make all my edits/key framing all over again.

  • Efficient way to collect DataSources connected to InfoSources

    Hi, I need to collect all the DataSources for a list of InfoSources. Is there any easy way to collect them? Is there an efficient way to get them from any tables or metadata? I am not looking to collect them manually.
    I appreciate any help on this and would like to give as many points as possible.

    Hi,
    you can get such a list from table rsisosmap.
    kind regards
    Siggi

  • Efficient way get FCE4 Log and Transfer to read .mts files stored on drive?

    Hi All
    I've searched the FCE discussion forum and not found an answer verified by more than one user to this question: What is an efficient way to get FCE4 (via the Log and Transfer window) to see .mts files from an AVCHD camera stored on a drive (NOT via the camera -- directly from the drive)?
    I am trying to plan the most space-efficient system possible for storing un-transcoded .mts files from a Panasonic AG-HMC151 on a harddrive so that I can easily ingest them into FCE4. I am shooting a long project and I want to be able to look at .mts files so that I can decide which ones to transcode to AIC for the edit.
    Since FCE4 cannot see .mts files unless they have their metadata wrapper the question is really 'how do I most efficiently transfer .mts files from the camera to a storage harddrive with their metadata wrappers so that FCE4 can see them via the log and transfer window?'
    Nick Holmes, in a reply in this thread
    http://discussions.apple.com/thread.jspa?messageID=10423384&#10423384
    gives 2 options: Use the Disk Utility to make a disk image of the whole SD card, or copy the whole contents of the card to a folder. He says he prefers the first option because it makes sure everything on the card is copied.
    a) Have other FCE users done this successfully and been able to read the .mts files via Log and Transfer?
    In a response to this thread:
    http://discussions.apple.com/thread.jspa?messageID=10257620&#10257620
    wallybarthman gives a method for getting Log and Transfer to see .mts files that have been stored on a harddrive without their metadata wrappers by using Toast 9 or 10.
    b) Have any other FCE4 users used this method? Does it work well?
    c) Why is FCE4 unable to see .mts files without their metadata wrappers in the Log and Transfer window? Is it just a matter of writing a few lines of code?
    d) Is there an archiving / library app. on the market that would allow one to file / name / tag many .mts clips and view them prior to transcoding into space-hungry AIC files in FCE?
    Any/all help would be most gratefully received!

    I have saved the complete file structure on DVD as a backup, but have not needed to open them yet. But I will add this: as I understand the options with Toast, you are in fact converting the video to AIC or something like it. I haven't looked into it myself; I can't imagine the extra files are that large, but maybe they are significant, I don't know. The transcoded files are huge in comparison to the AVCHD files.
    A new player on the scene for AVCHD is Clipwrap 2.0. As I understand this product, it rewraps the AVCHD into a wrapper that QuickTime can open and play. This works with the MTS files only; the rest of the file structure is not needed. The rewrap is much faster than the transcode to AIC, so you have the added benefit of being able to play the files as well as not storing the extra files. The 2.0 version (which is for AVCHD) was just recently released. I haven't tried it and don't personally know of anyone who has, but you might want to try it; there is a trial version as I recall.

  • Creating a time channel in the data portal and filling it with data - Is there a more efficient way than this?

    I currently have a requirement to create a time channel in the data portal and subsequently fill it with data. I've shown below how I am currently doing it:
    Time_Ch = ChnAlloc("Time channel", 271214, 1, , "Time", 1, 1)   'Allocate time channel
    For intLoop = 1 to 271214
      ChD(intLoop,Time_Ch(0)) = CurrDateTimeReal          'Create time value
    Next
    I understand that the function to create and allocate memory for the time channel is extremely quick. However the time to store data in the channel afterwards is going to be highly dependent on the length I have assigned to the Time_Ch. In my application the length of Time_Ch is variable but could easily be in the order of 271214 or higher. Under such circumstances the time taken to fill Time_Ch is quite considerable. I am wondering whether this is the most appropriate way of doing things or whether there is a more efficient way of creating a time channel and filling it.
    Thanks very much for any help.
    Regards
    Matthew

    Hi Matthew,
    You are correct that there is a more efficient way to do this.  I'm a little confused about your "CurrDateTimeReal" assignment-- is this a constant?  Most people want a Time channel that counts up linearly in seconds or fractions of a second over the duration of the measurement.  But that looks like you would assign the same time value to all the rows of the new Time channel.
    If you want to create a "normal" Time channel that increases at a constant rate, you can use the ChnGenTime() function:
    ReturnValue = ChnGenTime(TimeChannel, GenTimeUnit, GenTimeXBeg, GenTimeXEnd, GenTimeStep, GenTimeMode, GenTimeNo)
    If you really do want a Time channel filled with all the same values, you can use the ChnLinGen() function and simply set the GenXBegin and GenXEnd parameters to be the same value:
    ReturnValue = ChnLinGen(TimeChannel, GenXBegin, GenXEnd, XNo, [GenXUnitPreset])
     In both cases you can use the Time channel you've already created (which as you say executes quickly) and point the output of these functions to that Time channel by using the Group/Channel syntax of the Time channel you created for the first TimeChannel parameter in either of the above functions.
    Brad Turpin
    DIAdem Product Support Engineer
    National Instruments

  • The most efficient way to search a large String

    Hi All,
    2 Quick Questions
    QUESTION 1:
    I have about 50 String keywords -- I would like to use to search a big String object (between 300-3000 characters)
    Is the most efficient way to search it for my keywords like this ?
    if (myBigString.indexOf("string1") != -1 || myBigString.indexOf("string2") != -1 /* ...and so on for 50 strings */)
        System.out.println("it was found");
    QUESTION 2:
    Can someone help me out with a regular expression search of phone number in the format NNN-NNN-NNNN
    I would like it to return all instances of that pattern found on the page .
    I have done regular expressions in JavaScript and in VBScript, but I have never done regular expressions in Java.
    Thanks

    Answer 2:
    If you have the option of using Java 1.4, have a look at the new regular expressions library... whose package name I forget :-/ There have been articles published on it, both at JavaWorld and IBM's developerWorks.
    If you can't use Java 1.4, have a look at the jakarta regular expression projects, of which I think there are two (ORO and Perl-like, off the top of my head)
    http://jakarta.apache.org/
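    The regular expressions library that shipped with Java 1.4 is the java.util.regex package. A minimal sketch for pulling every NNN-NNN-NNNN phone number out of a page of text (the sample text here is made up):

        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class PhoneScan {
            public static void main(String[] args) {
                String page = "Call 555-123-4567 today, or 800-555-0199 after hours.";
                // \d{3}-\d{3}-\d{4} is the NNN-NNN-NNNN shape; the \b word
                // boundaries stop it from matching inside longer digit runs.
                Pattern phone = Pattern.compile("\\b\\d{3}-\\d{3}-\\d{4}\\b");
                Matcher m = phone.matcher(page);
                while (m.find()) {                    // find() walks every match in turn
                    System.out.println(m.group());    // prints each phone number found
                }
            }
        }

    Matcher.find() continues scanning from the end of the previous match, so the loop returns every occurrence on the page.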
    Answer 1:
    If you have n search terms, and are searching through a string of length l (the haystack, as in looking for a needle in a haystack), then searching for each term in turn will take time O(n*l). In particular, it will take longer the more terms you add (in a linear fashion, assuming the haystack stays the same length)
    If this is sufficient, then do it! The simplest solution is (almost) always the easiest to maintain.
    An alternative is to create a finite state machine that defines the search terms (Or multiple parallel finite state machines would probably be easier). You can then loop over the haystack string a single time to find every search term at once. Such an algorithm will take O(n*k) time to construct the finite state information (given an average search term length of k), and then O(l) for the search. For a large number of search terms, or a very large search string, this method will be faster than the naive method.
    One example of a state-search for strings is the Boyer-Moore algorithm.
    http://www-igm.univ-mlv.fr/~lecroq/string/tunedbm.html
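    If the built-in java.util.regex package is available, one cheap way to approximate the single-pass idea above is to compile all the keywords into a single alternation and walk the haystack once. A minimal sketch (the keyword list and haystack are made up, and keywords containing regex metacharacters would need escaping first):

        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class KeywordScan {
            public static void main(String[] args) {
                String haystack = "this haystack mentions string2 once and string3 twice: string3";
                String[] keywords = { "string1", "string2", "string3" };   // up to 50 terms

                // Join the keywords into one pattern: "string1|string2|string3"
                StringBuffer re = new StringBuffer();
                for (int i = 0; i < keywords.length; i++) {
                    if (i > 0) re.append('|');
                    re.append(keywords[i]);
                }

                // One compiled pattern, one pass over the haystack for all terms.
                Matcher m = Pattern.compile(re.toString()).matcher(haystack);
                while (m.find()) {
                    System.out.println("found \"" + m.group() + "\" at index " + m.start());
                }
            }
        }

    This is not a true single finite state machine, but it does avoid calling indexOf() 50 times over the same text.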
    Regards, and have fun,
    -Troy

  • Efficient way to repair a broken EPM report

    Hi experts,
    I'm looking for an efficient way to repair broken EPM reports.
    In the following cases the relevant EPM reports become broken: I can't click "Edit Report" (it is grayed out) and the report loses its connection to the model...
    i) when removing a dimension from a model (e.g. renaming the dimension ID by adding a new one and deleting the old one)
    ii) when removing a member that is located on the page axis of an EPM report.
    To avoid this situation, I make a point of taking the following measures:
    i) - clean up the dimension that I want to delete from the axes of the EPM reports before removing it from the relevant model
    ii) - clean up the member from the EPM reports before removing it.
    However, these measures become terribly tedious when the number of affected EPM reports is large... If the broken reports could reconnect to the model or "Edit Report" could be used again, the work would be much more efficient.
    Could you share a good idea?
    Regards,
    Masa

    Hi Vadim,
    Here is an example of how issue ii) appears.
    Issue i) was my misunderstanding, sorry...
    When opening the broken EPM report:
    a dialog appears if a member that has already been removed is located on the page axis. Unfortunately, "Edit Report" cannot be executed for this broken report.
    My question is how I can repair such a report. Please help me...
    Best regards,
    Masa

  • Efficient way of drawing?

    I'm looking for the most efficient way to draw; my application requires drawing images every 50 ms or so.
    What I'm doing is using a class which uses PixelGrabber to get a pixel array; this is kept around so it doesn't have to be loaded every single time.
    For example:

        public ImageClass image1 = new ImageClass("myimage");

    Then I have a class which has a set pixel array size, for example:

        public ScreenDrawing sd1 = new ScreenDrawing(400, 600, this);

    This uses the following code:

        public ScreenDrawing(int w, int h, Component component) {
            drawingAreaHeight = h;
            drawingAreaWidth = w;
            drawingAreaSize = new int[w * h]; // pixel array
            drawingAreaImage = component.createImage(this);
            component.prepareImage(drawingAreaImage, this);
            component.prepareImage(drawingAreaImage, this);
            component.prepareImage(drawingAreaImage, this);
        }

    Basically, when I want to draw the whole screen image I would do this:

        sd1.drawPixels(x, graphics, y);

    This would draw the pixel array to the screen at x, y.
    To get the pixels set on the ScreenDrawing class I would do:

        image1.setPixels(x, y);

    After this is set it paints it to the screen:

        public void drawGraphics(int xOffset, Graphics g, int yOffset) {
            setImageConsumer();
            g.drawImage(drawingAreaImage, xOffset, yOffset, this);
        }

        public synchronized void setImageConsumer() {
            if (imageConsumer != null) {
                imageConsumer.setPixels(0, 0, drawingAreaHeight, drawingAreaWidth, colorModel, drawingAreaSize, 0, drawingAreaHeight);
                imageConsumer.imageComplete(2);
            }
        }

    This is basically everything I use.
    But I have a problem: when drawing an image of, for example, just 96 x 96 pixels, the image flickers instead of drawing cleanly, because of the speed at which it needs to be redrawn to keep up with the image placed on top of it.
    If anyone knows a more efficient way to draw, please help.
    Thanks.

    If you are painting by using repaint(), then for an AWT component you need to override

        public void paint(Graphics g) {
            // call methods to do painting stuff
        }

    and override

        public void update(Graphics g) {
            paint(g);
        }

    to call paint. What's happening is that when you call repaint(), the AWT painting mechanism calls update(...). This method in turn erases the background before calling paint(...). The flickering is from seeing the erasing.
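    If overriding update() alone doesn't remove the flicker, the usual next step is double buffering: draw everything into an off-screen Image first and copy it to the screen in one go. A minimal, generic sketch (the class name and drawing details are illustrative, not taken from the code above):

        import java.awt.Canvas;
        import java.awt.Graphics;
        import java.awt.Image;

        public class BufferedCanvas extends Canvas {
            private Image buffer;   // off-screen image reused between repaints

            public void update(Graphics g) {
                paint(g);           // skip AWT's default background erase
            }

            public void paint(Graphics g) {
                int w = getWidth(), h = getHeight();
                if (buffer == null || buffer.getWidth(null) != w || buffer.getHeight(null) != h) {
                    buffer = createImage(w, h);     // (re)create the buffer when the size changes
                }
                Graphics bg = buffer.getGraphics();
                bg.clearRect(0, 0, w, h);
                // ... draw the frame into bg here (sprites, pixel images, etc.) ...
                bg.dispose();
                g.drawImage(buffer, 0, 0, this);    // one blit: nothing half-drawn is ever visible
            }
        }

    Swing components buffer like this automatically; for plain AWT, the off-screen buffer plus the update() override shown above is the standard way to stop the erase-then-paint flicker.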

  • Most efficient way to delete "removed" photos from hard disk?

    Hello everyone! Glad to have this great community to come to for help. I searched for this question but came up with no hits. If it's already been discussed, I apologize and would love to be directed to the link.
    My wife and I have been using LR for a long time. We're currently on version 4. Unfortunately, she's not as tech-savvy or meticulous as I am, and she has been unknowingly "Removing" photos from the LR catalogues when she really meant to delete them from the hard disk. That means we have hundreds of unwanted raw photo files floating around in our computer and no way to pick them out from the ones we want! As a very organized and space-conscious person, I can't stand the thought. So my question is, what is the most efficient way to permanently delete these unwanted photos from the hard disk
    I did find one suggestion that said to synchronize the parent folder with their respective catalogues, select all the photos in "Previous Import," and delete those, since they will be all of the photos that were previously removed from the catalogue.
    This is a great suggestion, but it probably wouldn't work for all of my catalogues since my file structure is organized by date (the default setting for LR). So, two catalogues will share the same "parent folder" in the sense that they both have photos from May 2013, but if I synchronize May 2013 with one, then it will get all the duds PLUS the photos that belong in the other catalogue.
    Does anyone have any suggestions? I know there's probably not an easy fix, and I'm willing to put in some time. I just want to know if there is a solution and make sure I'm working as efficiently as possible.
    Thank you!
    Kenneth

    I have to agree with the comment about multiple catalogs referring to images that are mixed in together... and the added difficulty that may have brought here.
    My suggestions (assuming you are prepared to combine the current catalogs into one)
    in each catalog, put a distinctive keyword onto all the images so that you can later discriminate these images as to which particular catalog they were formerly in (just in case this is useful information later)
    as John suggests, use File / "Import from Catalog" to bring all LR images together into one catalog.
    then in order to separate out the image files that ARE imported to LR, from those which either never were / have been removed, I would duplicate just the imported ones, to an entirely separate and dedicated disk location. This may require the temporary use of an external drive, with enough space for everything.
    to do this, highlight all the images in the whole catalog, then use File / "Export as Catalog" selecting the option "include negatives". Provide a filename and location for the catalog inside your chosen new saving location. All the image files that are imported to the catalog will be selectively copied into this same location alongside the new catalog. The same relative arrangement of subfolders will be created there, for them all to live inside, as is seen currently. But image files that do not feature in LR currently, will be left behind by this operation.
    your new catalog is now functional, referring to the copied image files. Making sure you have a full backup first, you can start deleting image files from the original location, that you believe to be unwanted. You can do this safe in the knowledge that anything LR is actively relying on, has already been duplicated elsewhere. So you can be quite aggressive at this, only watching out for image files that are required for other purposes (than as master data for Lightroom) - e.g., the exported JPG files you may have made.
    IMO it is a good idea to practice a full separation of image files used in your LR image library, from all other image files. This separation means you know where it is safe to manage images freely using the OS, vs where (what I think of as the LR-managed storage area) you need to bear LR's requirements constantly in mind. Better for discrete backup, too.
    In due course, as required, the copied image files plus catalog can be moved bodily to another drive (for example, if they have been temporarily put on an external drive, and you want to store them on your main internal one again). This then just requires a single re-browsing of their parent folder's location, in order to correct LR's records inside this catalog, as to the image files' changed addresses.
    If you don't want to combine the catalogs into one, a similar set of operations as above, can be carried out for each separate catalog you have now. This will create a separate folder structure in each case, containing just those duplicated image files. Once this has been done for all catalogs, you can start to clean up the present image files location. IMO this is very much the laborious and inflexible option, so far as future management of the total body of images is concerned... though there may still be some overriding reason for working that way.
    RP

  • EFFICIENT way of escalating an open task

    I need to escalate TASKS that are still open after 31 days.
    I figure I need 2 workflows to do this.
    As i see it right now:
    1st WF. Waits for 31 days after the task has been created. On the 31st day it changes a read only field called "escalate" to YES.
    2nd WF checks for changes in tasks where: If (Status=OPEN AND escalate<>pre(escalate)) is true, then send an escalation email or task.
    Is there a more efficient way of doing this?
    TIA
    Paul

    Is there a reason you want two workflows? Why not put an e-mail action after the Wait on the same workflow? If you check the "Reevaluate Rule Conditions After Wait" checkbox on the Wait action, the workflow rule will be re-evaluated after your 31 days... so it would only send the e-mail message if the Task is still open (assuming your workflow condition is set to look at Status = Open).
    Chris

  • SQL query with multiple tables - what is the most efficient way?

    Hello I am learning PL/SQL. I have a simple procedure where I need to find number of employees and departments per location as per user input of location_id.
    I have 3 Tables:
    LOCATIONS
    location_id (pk)
    location_name
    DEPARTMENTS
    department_id (pk)
    location_id (fk)
    department_name
    EMPLOYEES
    employee_id (pk)
    department_id (fk)
    employee_name
    1 Location can have 0-MANY Departments
    1 Employee has 1 Department
    Here is the query I came up with for PL/SQL procedure:
    /*Ecount, Dcount are NUMBER variables */
    SELECT SUM (EmployeeCount), COUNT(DepartmentNumber)
         INTO Ecount, Dcount
         FROM     
         (SELECT COUNT(employee_id) EmployeeCount, department_id DepartmentNumber
              FROM employees
              GROUP BY department_id
              HAVING department_id IN
                        (SELECT department_id
                        FROM departments
                        WHERE location_id = userInput));
    I do get the correct result, but I am just wondering if my query is on the right track and if there is a more "efficient" way of doing this.
    Thanks in advance for helping a newbie out.

    Hi,
    Welcome to the forum!
    Something like this will be more efficient:
    SELECT  COUNT (employee_id)             AS ECount
    ,       COUNT (DISTINCT department_id)  AS DCount
    FROM    employees
    WHERE   department_id IN (  SELECT  department_id
                                 FROM    departments
                                 WHERE   location_id = :userInput
                              );
    You should also try a join instead of the IN subquery.
    For efficiency, do only the things you need to do.
    For example, you don't need a count of employees in each department, so don't compute one. That means you won't need the in-line view, so don't have one.
    You don't need PL/SQL for this job, so don't use PL/SQL if you don't have to. (I realize this question was out of context, so you may have good reasons for doing this in PL/SQL.)
    Do all filtering as early as possible. Don't waste effort computing things that won't be used .
    A particular example of this is: Never use a HAVING clause when you can use a WHERE clause. What's the difference between a WHERE clause and a HAVING clause? The WHERE clause is applied before aggregate functions are computed, and the HAVING clause is applied after; there's no other difference. Therefore, if the HAVING clause isn't referencing an aggregate function, it could be done in a WHERE clause instead.

  • Usage of CRM_ORDER_MAINTAIN SAP Help Link and Suggestions Required

    This is regarding the function module CRM_ORDER_MAINTAIN.
    Currently we are calling CRM_ORDER_MAINTAIN outside the loop, i.e. once for all the contracts, in line with SAP CRM performance standards.
    Is there any standard documentation stating that CRM_ORDER_MAINTAIN should only be used outside a loop?
    Some colleagues are telling us to call it inside LOOP ... ENDLOOP because of a timeout error in a background job for a contract with 180 line items, which I feel creates unnecessary load on execution (not recommendable).
    Please share any comments and suggestions on this, and also provide me with the SAP Help link on this topic.

    Hi,
    As an example, look at FM CRM_LRP_APPOINTMENTS_GET_DIAG, which uses CRM_ORDER_MAINTAIN with the parameter IT_APPOINTMENT.
    Hope this helps.
    Regards, Kostya Khveshchenik
