Optimize Report 9i

How to optimize a report on Developer 9i?

Could you be more specific?
I would say the best way is to make sure the queries run well outside of Report Builder. If they are slow there, they are going to appear to run even slower in a report, because Reports must format your data based on the frames, and this will add processing time.
Also, only use a data link when absolutely necessary. This causes extra trips to the database.
That was a very general question, but I would start by making sure the queries and any PL/SQL formulas are finely tuned; that's more than half the battle right there!
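As a concrete illustration of tuning the query outside Report Builder first, here is a minimal sketch using EXPLAIN PLAN in SQL*Plus. The table and column names are hypothetical placeholders for the actual report query, and DBMS_XPLAN assumes Oracle 9i Release 2 or later:

    -- Run the report query stand-alone and inspect its execution plan
    -- (hypothetical tables; substitute the real report query here).
    EXPLAIN PLAN FOR
    SELECT d.dept_name, SUM(e.salary)
    FROM   employees e, departments d
    WHERE  d.dept_id = e.dept_id
    GROUP  BY d.dept_name;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

If the query is already slow or the plan looks bad here, fix it in SQL before looking at the report layout.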

Similar Messages

  • Optimize report script having many Link statements

    Hi,
    I have a report script in Essbase which uses around 10 <LINK statements to extract level 1 entity members from descendants of specific entity members and around 8 <LINK statements to extract level 0 account members from descendants of specific account members. I observed that this report script takes almost 10 to 12 hours to execute. After some investigation I found out that ideally a report script should have a maximum of 5 <LINK statements; if there are more than 5 <LINK statements, the report script runs very slowly.
    Following is excerpt from my report script:
    <PAGE ("Scenario", "Year")
    <ROW("Entity", "Account", "Custom1", "Custom2")
    <COLUMN("Period")
    For Entity:
    <LINK((<LEV("Entity","Lev1,Entity")) AND (<IDESC("Ent001")))
    <LINK((<LEV("Entity","Lev1,Entity")) AND (<IDESC("Ent002")))
    <LINK((<LEV("Entity","Lev1,Entity")) AND (<IDESC("Ent003")))
    For Account:
    <LINK ((<LEV("Account", "Lev0,Account")) AND (<IDESC("Acc001")))
    <LINK ((<LEV("Account", "Lev0,Account")) AND (<IDESC("Acc002")))
    <LINK ((<LEV("Account", "Lev0,Account")) AND (<IDESC("Acc003")))
    Could you please help me fine-tune/optimize this report script, or suggest an alternate way to write it without <LINK statements so that it runs faster?
    Thanks in advance!
    AK

    You state 10 but show six. Which is it?
    As for performance, if you make it just one Entity dimension LINK statement, is it fast?
    Following on that, have you tried breaking the report into multiple report scripts? If they are individually faster, you could just concatenate the output files with an OS batch script. Ten individual report scripts (and we are potentially talking about more than that, though I can't tell from the information you've provided) could be complicated to manage, but it might be worthwhile from a prototyping perspective.
    Have you tried joining them together with a "!" statement? What that means is: you write a report script and terminate it with a !, then write another, similar report script and stick a ! on it, and so on. Essbase reads to the ! and then goes on to the next report within the same script (see the sketch below).
    Regards,
    Cameron Lackpour
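    For illustration, a minimal sketch of the stacked-report approach described above (the "!" terminators), reusing the PAGE/ROW/COLUMN layout and member names from the original post; whether it is actually faster needs to be tested:
        <PAGE ("Scenario", "Year")
        <ROW ("Entity", "Account", "Custom1", "Custom2")
        <COLUMN ("Period")
        <LINK ((<LEV("Entity","Lev1,Entity")) AND (<IDESC("Ent001")))
        <LINK ((<LEV("Account","Lev0,Account")) AND (<IDESC("Acc001")))
        !
        <PAGE ("Scenario", "Year")
        <ROW ("Entity", "Account", "Custom1", "Custom2")
        <COLUMN ("Period")
        <LINK ((<LEV("Entity","Lev1,Entity")) AND (<IDESC("Ent002")))
        <LINK ((<LEV("Account","Lev0,Account")) AND (<IDESC("Acc002")))
        !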

  • BEx Analyzer: How to optimize reporting over a hierarchy?

    Hello there,
    Using BEx Analyzer & Excel, we're reporting over a large hierarchy of cost centers, beginning at the top level and then looping over 166 different cost centers in that hierarchy.
    The looping works from within a macro, calling SAPBEXSetFilterValue for the corresponding query.
    Reporting takes about 5 hours at the moment. Is there any way to speed up the calculation after setting the cost center as a filter value? Example: we need to report the top level node anyway, so the query needs to calculate over all hierarchy nodes. What happens to the results of the lower level nodes? Are they just dropped and calculated again later when the cost center of that lower level node is set as filter value? It would be a good thing if BW could cache these results so that it just needs to download them when needed later instead of calculating again.
    Any ideas? I don't know very much about BW on that point, so any hint is very welcome. Our release is BW 3.1.
    We already tried RSRT, but all we could do was set the behaviour in case of query navigation. I'm not quite sure whether this covers the case we want (setting the new filter value). In any case, execution of that report still takes about 5 hours.
    Thank you very much.

    Andreas,
    Create Aggregates on Hierarchy level.
    Use Read mode H.
    Make use of OLAP CACHE.
    Have a look at RSRCACHE.
    Regards,
    Ramkumar Ghattamaneni.

  • Discoverer Report Optimization

    Hi
    I'm a newbie on this forum. My question is: what is the first thing to look for when you're optimizing reports that are running too long? I've tried isolating where it is slowing down, and so far I've discovered that it's coming from one of the conditions.
    Functions used:
    Decode
    NVL
    Round

    I have a custom folder with a query and I want to run it passing parameters (using the method explained in note 282249.1).
    The parameters are 20061001000000 and 20061031999999 and I want them to apply to a numeric column which contains a date and a number (e.g. 20061001123456).
    If I run the query in SqlNavigator with this instruction
    where rdoc between 20061001000000 and 20061031999999
    the query returns all rows (1210) in 15 seconds;
    If I run the custom folder in Discoverer with
    where rdoc between PKG_AC_SETPARAM.GET_PARAM1 and PKG_AC_SETPARAM.GET_PARAM2
    the query takes between 4 and 5 minutes.
    The question is:
    1) What causes such a big difference in performance?
    2) How can I solve this problem?
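    As a hedged sketch of what is being compared: when the bounds are literals the optimizer knows the range at parse time, whereas the package functions are opaque and may be re-evaluated per row. The folder/view name below is hypothetical, and the scalar-subquery rewrite (or declaring the functions DETERMINISTIC) is a common workaround offered as an assumption, not as this thread's confirmed answer:
        -- Fast: literal bounds (about 15 seconds in SQL Navigator)
        SELECT * FROM my_custom_folder_view
        WHERE  rdoc BETWEEN 20061001000000 AND 20061031999999;

        -- Slow: the same range hidden behind package function calls
        SELECT * FROM my_custom_folder_view
        WHERE  rdoc BETWEEN PKG_AC_SETPARAM.GET_PARAM1 AND PKG_AC_SETPARAM.GET_PARAM2;

        -- Possible workaround (assumption): force a single evaluation via scalar subqueries
        SELECT * FROM my_custom_folder_view
        WHERE  rdoc BETWEEN (SELECT PKG_AC_SETPARAM.GET_PARAM1 FROM dual)
                        AND (SELECT PKG_AC_SETPARAM.GET_PARAM2 FROM dual);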

  • ASK THE EXPERTS - WAAS MONITORING AND REPORTING

    Welcome to the Cisco Networking Professionals Ask the Expert conversation. This is an opportunity to learn about Cisco Wide Area Application Services monitoring and reporting with Michael Holloway and Joe Merrill. Michael is an escalation support engineer in the Application Delivery Business Unit focusing on escalations to engineering related to the Cisco Wide Area Application Services (WAAS) product. He has worked with Cisco WAAS since its initial development, and with the first product beta.
    Joe Merrill is an escalation support engineer in the Application Delivery Business Unit focusing on escalations to engineering related to the Cisco Wide Area Application Services (WAAS) product. He has worked with Cisco WAAS since its initial development, and with the first product beta.
    Remember to use the rating system to let Michael and Joe know if you have received an adequate response.
    Michael and Joe might not be able to answer each question due to the volume expected during this event. Our moderators will post many of the unanswered questions in other discussion forums shortly after the event. This event lasts through August 27, 2010. Visit this forum often to view responses to your questions and the questions of other community members.

    Very good questions. Let me try and take them one at a time. Some of the answers you will likely find in the CM GUI help (upper-left corner is the Help button), or in the online documentation. But let's add a little more color and detail.
    1) When we pull the bandwidth optimization report, the Y-axis of the graph says Effective Capacity. What is Effective Capacity?
    Basically, the "effective increased bandwidth capacity" is telling you how much additional WAN bandwidth you've gained because of the optimization. It will chart somewhere between 1 times and 100 times. Typically it charts all traffic, though you can configure it to chart traffic for specific Applications.
    The CDM online help gives the formulas used to chart the graph:
    Effective WAN Capacity = 1 / (1-% Reduction Excluding Pass-Through)
    % Reduction Excluding Pass-Through = (Original Excluding Pass-Through - Optimized) / (Original Excluding Pass-Through)
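    As a hypothetical worked example: with a 75% reduction excluding pass-through, Effective WAN Capacity = 1 / (1 - 0.75) = 4, i.e. the optimized link behaves as if it had roughly four times its nominal bandwidth.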
    2) What is reduction % excluding and including pass-through?
    Looking at the formulas given above might help you understand. The one is a reduction ratio compared to only the original traffic that is optimized. The other is a reduction ratio compared to all original traffic, whether it is optimized or not. So, if you want to know what kind of optimization you are getting for the traffic that you configured to have optimized, look at the "excluding pass-through" numbers. If you want to know the positive effect that optimization is having on your full traffic load, take a look at the "including pass-through" numbers.
    3) What is effective capacity including and excluding pass-through?
    The effective capacity is what kind of throughput you can potentially realize on the WAN -- assuming you would fill it to 100% capacity -- because of the level of optimization you are seeing. The "including" numbers show you the effect of optimization compared to all the traffic passing through the WAE whether it is optimized or not. The "excluding" numbers show the effect of optimization compared only to the traffic that is receiving optimizations.
    4) With the help of which report can we show the customer that a file download which took 10 minutes on the first attempt is downloaded in 10 seconds on the next attempt?
    This one is a little trickier. The reports are much broader than a single connection. They are for all traffic, or for traffic that matches specific defined Applications. You could create a separate Application and matching classifiers for the client and/or server IP addresses and/or ports, run the test, then configure the charts to only show you the data for that Application. By default, statistics for an Application aren’t charted unless you check the "Enable Statistics" box when defining/editing the Application.
    5) How do we show by what % the bandwidth utilization has decreased?
    You want to look at the % reduction numbers you asked about in #2 above.
    6) Which report says that the applications have become this much faster?
    These questions are normally put forward by many customers. Can you please help me with your expert answer?
    This is probably the hardest question to answer.
    "Faster" isn't always easy to define. You are probably talking about user experience rather than statistics found in a network device. What determines that experience? A web page fully populating with all the pictures? A CIFS-based application that saves a file? A custom application that collects data from different servers over different protocols to perform some operation? Much of that is subjective and based on multiple individual requests, sometimes over different protocols.
    What we can provide are statistics to show the effect of WAN optimization and application acceleration for specific types of traffic. We can't show you that displaying a web page is N times faster with WAAS, because we don't know which of all the many HTTP requests that are made are specific to the user experience. But we can show that each of the requests received so much overall optimization, so much optimization from DRE, so much optimization from LZ, so much added benefit from HTTP acceleration.
    What you would probably do is collect some base-line timings for performing certain user activities, then perform the same operations both cold (first pass) and warm (subsequent passes). Back up those timing numbers with reports from the CM GUI, and perhaps even the "show statistics connection connection-id ". Which reports to use? Start with those Optimization and Acceleration reports. Those are the reports we expect will give the most complete/accurate pictures of the benefit of WAAS. You can also create and even schedule custom reports as needed.

  • Website Optimization Software Question

    If I drop a published folder from iWeb into Rage WEBCRUSHER is the folder now
    optimized such that I could use it with any kind of FTP transport ?
    ...or do I need to export the file from WEBCRUSHER in order to consummate the optimization?
    The reason I ask is that the published folder (prior to optimization) is 17MB.
    While the optimization report says that WEBCRUSHER saved 7.36 MB of 15.66 MB,
    Get Info on the computer says the folder is still 17+ MB in size.
    Do I need to use the optimization software to upload the file or can I just use it to optimize
    then transport it with a different FTP?
    The reason I ask is because when I upload the file via WEBCRUSHER I get an error message saying
    "tcp error, connection dropped".
    I am able to successfully upload with FTP Transport but I cannot determine whether or not
    what I am viewing is an "optimized" file or just a regular file.

    Apptorial,
    Here it is: http://testidea.net/Seattlesbestkitchens/Home.html
    Some of the links have been knitted together.
    The interim goal was to just test download speed and image quality.
    Thanks a lot!

  • Free resources that ensure the health and optimization of your site/s.

    I thought that I should share two resources that I found priceless.
    1. To check whether your site has any malware or has been hacked. These folks will check and clean your site for you for free:
         http://sitecheck.sucuri.net/scanner/
    2. To check for optimization tips and places where you can shed some weight off your site, visit:
         http://gtmetrix.com/dashboard.html
    Feel free to suggest anything else that you might know of.

    Check spelling and grammar.
    Validate CSS & HTML code.
    To ensure menus, links and widgets all work, fully test your site in the 5 major browsers. If you must support older IE, get IE Tester.
    http://www.my-debugbar.com/wiki/IETester/HomePage
    Use Firefox's web developer toolbar.
    https://addons.mozilla.org/en-US/firefox/addon/web-developer/
    Web Page Analyzer & Optimization report
    http://www.websiteoptimization.com/services/analyze/
    Set-up Google WebMaster Tools and Google Analytics.
    http://www.google.com/webmasters/
    http://www.google.com/analytics/
    Nancy O.
    Alt-Web Design & Publishing
    Web | Graphics | Print | Media  Specialists 
    http://alt-web.com/
    http://twitter.com/altweb

  • Cannot print second or third report to printer, only to window

    Hi everybody,
    We have an app in VB6 and have just migrated from CRXI to CRXIR2 SP6.
    There is a form where the user can print a main report, and optionally other 2 reports that we call "attachments". They are all independent RPT files.
    When the user chooses to send the main report and the attachments to the window, the app calls the viewer and we can see the 3 or more windows, so we can print.
    But, when the user chooses to send all the rpt printings directly to the printer, the app passes through the 3 print calls (I could see via debug), but only the first is actually printed.
    This is the call:
                          rpt.PrintOut False
    My question: Is there something we have to do between the print calls?
    What I am doing is only setting the new RPT and its properties, like that:
                       Set rpt = applic.OpenReport(PathRPT + "SEFoto" + TipoRPT + ".rpt")
                       rpt.FormulaSyntax = crCrystalSyntaxFormula
                       rpt.FormulaFields.GetItemByName("Empresa").Text = "'" + Company + "'"
    etc...
    Note: The rpt object is defined on the module level:
    Public applic As New CRAXDRT.Application
    Public rpt As CRAXDRT.Report
    Thanks for any help!
    Isis
    Brazil
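    For reference, a minimal VB6 sketch of the flow described in the question. The report file names are hypothetical placeholders for the "SEFoto" + TipoRPT + ".rpt" pattern, and releasing the report object between calls is only an assumption for illustration, not a confirmed fix:
        Dim attachments As Variant
        Dim i As Integer
        ' Hypothetical names standing in for "SEFoto" + TipoRPT + ".rpt"
        attachments = Array("SEFotoMain.rpt", "SEFotoAnexo1.rpt", "SEFotoAnexo2.rpt")
        For i = LBound(attachments) To UBound(attachments)
            Set rpt = applic.OpenReport(PathRPT + attachments(i))
            rpt.FormulaSyntax = crCrystalSyntaxFormula
            rpt.FormulaFields.GetItemByName("Empresa").Text = "'" + Company + "'"
            rpt.PrintOut False          ' False = print without prompting the user
            Set rpt = Nothing           ' release before opening the next report (assumption)
        Next i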

    You won't believe what was the problem with my app...
    I tried all the suggestions Ludek gave, including making a sample app very simple with the 3 printout calls, and the result was the same, only the first report went to the printer...
    So, after many tests, I concluded that the problem must be in the RPTs, not in the app!
    Then I started to compare the 3 reports, checking everything that could be different and could cause this strange behavior.
    One difference that I found was the option "No Printer" in the "page setup" dialog box.
    The first report had the "No Printer" unselected, and the 2nd and 3rd report had the option "No Printer" selected!
    So I unselected the option, and the reports were printed!!! Halleluiaaa!!!!
    I remember that maybe this option was selected when I created these reports because I had some reports in the past that could never be printed directly to the printer, they were always directed to PDFCreator, that maybe was my default printer when I created the report. In these cases, setting the "No Printer" option solved this problem!
    But now, searching in the help file about this option, I get the explanation that is:
    "Select the No Printer check box when printing is not an option from a particular workstation"
    Is it correct?
    When we are on the Page Setup screen, what we read is No Printer (optimize for screen display).
    So we understand that it is a good option, that will optimize report view, I could never think that it would cause the report not to go to the printer...
    Anyway, thanks for all help and attention,
    Isis

  • In LO Cockpit, Job contol job is cancelled in SM37

    Dear All
    I am facing one problem. Please help me to resolve this issue.
    Whenever I schedule the delta job for application component 03, it is cancelled in SM37. Hence, I couldn't retrieve the data from the queued delta MCEX03 (SMQ1 or LBWQ) to RSA7 because the job is cancelled. When I looked at the job log I got the runtime error below. Please help me to resolve this issue.
    Runtime Errors         MESSAGE_TYPE_X
    Date and Time          04.10.2007 23:46:22
    Short text
    The current application triggered a termination with a short dump.
    What happened?
    The current application program detected a situation which really
    should not occur. Therefore, a termination with a short dump was
    triggered on purpose by the key word MESSAGE (type X).
    What can you do?
    Note down which actions and inputs caused the error.
    To process the problem further, contact you SAP system
    administrator.
    Using Transaction ST22 for ABAP Dump Analysis, you can look
    at and manage termination messages, and you can also
    keep them for a long time.
    Error analysis
    Short text of error message:
    Structures have changed (sy-subrc=2)
    Long text of error message:
    Technical information about the message:
    Message class....... "MCEX"
    Number.............. 194
    Variable 1.......... 2
    Variable 2.......... " "
    Variable 3.......... " "
    Variable 4.......... " "
    How to correct the error
    Probably the only way to eliminate the error is to correct the program.
    If the error occures in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "MESSAGE_TYPE_X" " "
    "SAPLMCEX" or "LMCEXU02"
    "MCEX_UPDATE_03"
    If you cannot solve the problem yourself and want to send an error
    notification to SAP, include the following information:
    1. The description of the current problem (short dump)
    To save the description, choose "System->List->Save->Local File
    (Unconverted)".
    2. Corresponding system log
    Display the system log by calling transaction SM21.
    Restrict the time interval to 10 minutes before and five minutes
    after the short dump. Then choose "System->List->Save->Local File
    (Unconverted)".
    3. If the problem occurs in a program of your own or a modified SAP
    program: The source code of the program
    In the editor, choose "Utilities->More
    Utilities->Upload/Download->Download".
    4. Details about the conditions under which the error occurred or which
    actions and input led to the error.
    Thanks in advance
    Raja

    LO EXTRACTION:
    First Activate the Data Source from the Business Content using “LBWE”
    For Customizing the Extract Structure – “LBWE”
    Maintaining the Extract Structure
    Generating the Data Source
    Once the DataSource is generated, make the necessary settings for:
    Selection
    Hide
    Inversion
    Field only known in exit
    and then save the DataSource.
    Activate the Data Source
    Using “RSA6” transport the Data Source
    Replicate the Data Source in SAP BW and Assign it to Info source and Activate
    Running the statistical setup to fill the data into the setup tables
    Go to “SBIW” and follow the path
    We can cross check using “RSA3”
    Go Back to SAP BW and Create the Info package and run the Initial Load
    Once the “Initial delta” is successful before running “delta” load we need to set up “V3 Job” in SAP R/3 using “LBWE”.
    Once the Delta is activated in SAP R/3 we can start running “Delta” loads in SAP BW.
    Direct Delta: In case of Direct Delta, LUWs are directly posted to the delta queue (RSA7) and we extract the LUWs from the delta queue to SAP BW by running delta loads. Direct Delta degrades OLTP system performance because when LUWs are posted directly to the delta queue (RSA7), the application is kept waiting until all the enhancement code is executed.
    Queued Delta: In case of Queued Delta, LUWs are posted to the extractor queue (LBWQ); by scheduling the V3 job we move the documents from the extractor queue (LBWQ) to the delta queue (RSA7), and we extract the LUWs from the delta queue to SAP BW by running delta loads. Queued Delta is recommended by SAP; it maintains an extractor log which helps us handle LUWs that are missed.
    V3 -> Asynchronous Background Update Method. As the name suggests, it is an asynchronous update method that runs as a background job.
    Update Methods,
    a. (Serialized) V3 Update
    b. Direct Delta
    c. Queued Delta
    d. Un-serialized V3 Update
    Note: Before PI Release 2002.1 the only update method available was V3 Update. As of PI 2002.1 three new update methods are available because the V3 update could lead to inconsistencies under certain circumstances. As of PI 2003.1 the old V3 update will not be supported anymore.
    a. Update methods: (serialized) V3
    • Transaction data is collected in the R/3 update tables
    • Data in the update tables is transferred through a periodic update process to BW Delta queue
    • Delta loads from BW retrieve the data from this BW Delta queue
    Transaction postings lead to:
    1. Records in transaction tables and in update tables
    2. A periodically scheduled job transfers these postings into the BW delta queue
    3. This BW Delta queue is read when a delta load is executed.
    Issues:
    • Even though it is called serialized, the correct sequence of extraction data cannot be guaranteed
    • V2 update errors can lead to V3 updates never being processed
    Update methods: direct delta
    • Each document posting is directly transferred into the BW delta queue
    • Each document posting with delta extraction leads to exactly one LUW in the respective BW delta queues
    Transaction postings lead to:
    1. Records in transaction tables and directly in the BW delta queue
    2. No periodic collective run is needed; each posting is transferred immediately
    3. This BW delta queue is read when a delta load is executed.
    Pros:
    • Extraction is independent of V2 update
    • Less monitoring overhead of update data or extraction queue
    Cons:
    • Not suitable for environments with high number of document changes
    • Setup and delta initialization have to be executed successfully before document postings are resumed
    • V1 is more heavily burdened
    Update methods: queued delta
    • Extraction data is collected for the affected application in an extraction queue
    • Collective run as usual for transferring data into the BW delta queue
    Transaction postings lead to:
    1. Records in transaction tables and in extraction queue
    2. A periodically scheduled job transfers these postings into the BW delta queue
    3. This BW Delta queue is read when a delta load is executed.
    Pros:
    • Extraction is independent of V2 update
    • Suitable for environments with high number of document changes
    • Writing to extraction queue is within V1-update: this ensures correct serialization
    • Downtime is reduced to running the setup
    Cons:
    • V1 is more heavily burdened compared to V3
    • Administrative overhead of extraction queue
    Update methods: Un-serialized V3
    • Extraction data is written, as before, into the update tables with a V3 update module
    • The V3 collective run transfers the data to the BW delta queue
    • In contrast to serialized V3, the data in the updating collective run is read from the update tables without regard to sequence
    Transaction postings lead to:
    1. Records in transaction tables and in update tables
    2. A periodically scheduled job transfers these postings into the BW delta queue
    3. This BW Delta queue is read when a delta load is executed.
    Issues:
    • Only suitable for data target design for which correct sequence of changes is not important e.g. Material Movements
    • V2 update has to be successful
    Direct Delta: With this update mode, the extraction data is transferred with each document posting directly into the BW delta queue. In doing so, each document posting with delta extraction is posted for exactly one LUW in the respective BW delta queues.
    Queued Delta: With this update mode, the extraction data is collected for the affected application in an extraction queue, and can be transferred as usual with the V3 update, by means of an updating collective run, into the BW delta queue. In doing so, up to 10000 delta extractions of documents for an LUW are compressed for each DataSource into the BW delta queue, depending on the application.
    Non-serialized V3 Update: With this update mode, the extraction data for the application in question is written, as before, into the update tables with the help of a V3 update module. It is kept there until the data is selected and processed by an updating collective run. However, in contrast to the current default setting (serialized V3 update), the data in the updating collective run is read from the update tables without regard to sequence and transferred to the BW delta queue.
    V1 - Synchronous update
    V2 - Asynchronous update
    V3 - Batch asynchronous update
    These are different work processes on the application server that take the update LUW (which may contain various DB manipulation SQL statements) from the running program and execute it. They are separated to optimize transaction processing capabilities.
    Taking an example -
    If you create/change a purchase order (me21n/me22n), when you press 'SAVE' and see a success message (PO.... changed...), the update to underlying tables EKKO/EKPO has happened (before you saw the message). This update was executed in the V1 work process.
    There are some statistics collecting tables in the system which can capture data for reporting. For example, LIS table S012 stores purchasing data (it is the same data as EKKO/EKPO stored redundantly, but in a different structure to optimize reporting). Now, these tables are updated with the txn you just posted, in a V2 process. Depending on system load, this may happen a few seconds later (after you saw the success message). You can see V1/V2/V3 queues in SM12 or SM13.
    V3 is specifically for BW extraction. The update LUW for these is sent to V3 but is not executed immediately. You have to schedule a job (e.g. in LBWE definitions) to process these. This is again to optimize performance.
    V2 and V3 are separated from V1 as these are not as real time critical (updating statistical data). If all these updates were put together in one LUW, system performance (concurrency, locking etc) would be impacted.
    Serialized V3 update is called after V2 has happened (this is how the code running these updates is written) so if you have both V2 and V3 updates from a txn, if V2 fails or is waiting, V3 will not happen yet.
    Hope this helps.

  • Pls answer the following questions very urgent.

    Hi Experts & Friends,
    Can someone please answer the following questions? If possible, try to give some explanation.
    1. Why do we use the Analysis Process Designer?
    2. What are web templates?
    3. How is the connection from an SRM source system done?
    4. How do you get data from SQL Server through DB Connect?
    5. Process chain: two targets from the same DataSource, one monthly & one daily
    6. How do you display the scanned & unscanned pdt? Through variable replacement type?
    Thanks & Regards
    Siri

    Hi Siri,
    1. Analysis Process designer
    The Analysis Process Designer (APD) is the application environment for the SAP data mining solution.
    The APD workbench provides an intuitive graphical interface that enables you to visualize, transform, and deploy data from your business warehouse. It combines all these different steps into a single data process with which you can easily interact.
    Use APD to pre-process your data:
    • Read data from different sources and write it to a single location
    • Transform data to optimize reporting
    • Ensure high data quality by monitoring and maintaining the information stored in your data warehouse.
    The APD is able to source data from InfoProviders such as InfoCubes, ODS objects, InfoObjects,
    database tables, BW queries, and flat files. Transformations include joins, sorts, transpositions, and with
    BW 3.5, integration with BW's Data Mining Workbench (RSDMWB).
    Check here
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/96939c07-0901-0010-bf94-ac8b347dd541
    http://help.sap.com/saphelp_nw04/helpdata/en/49/7e960481916448b20134d471d36a6b/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/39/e45e42ae1fdc54e10000000a155106/frameset.htm
    and service.sap.com/bi
    https://websmp206.sap-ag.de/~form/sapnet?_SHORTKEY=01100035870000585703&
    2. Web Templates
    Web Template
    Use
    This Web item can be used to manage consistent sections of different Web templates centrally within one Web template, which you can then insert into any Web template as required. In this way, you can define a header or footer section with the corporate logo and heading as a Web template and can integrate this Web template into your Web applications as a Web Template Web item. This Web template is then inserted during runtime. In contrast to HTML frame technology, the system does not generate a new page during this process. The context of the main template thus remains the same. In this way, you can display text elements and so on from data providers for the main template in the inserted Web template.
    Check....here....
    http://help.sap.com/saphelp_nw04/helpdata/en/69/5f8e9346c1244ea64ab580e2eea8b9/frameset.htm
    3.Srm Source system Connection
    2.3     SAP BW
    2.3.1     Define Client Administration
    Use
    This activity defines changes and transports of the client-dependent and client-independent objects.
    Procedure
    1.     To perform this activity, choose one of the following navigation options:
    SAP BW Role Menu     Local Settings → Define Client Administration
    Transaction Code     SCC4
    SAP BW Menu     Tools → Administration → Administration → Client Administration → Client Maintenance
    2.     Switch to change mode.
    3.     Select your client.
    4.     Choose details.
    5.     In field Currency enter the ISO-code of the local currency, e.g. USD or EUR.
    6.     In field Client Role enter Customizing
    7.     Check the settings for changes and transport of client-specific objects and client-independent object changes
    If you want to use the settings made by BC Sets or manually in other systems (other than BW), "Automatic recording of changes" and "Changes to Repository object and cross-client Customizing allowed" are required.
    Result
    Client administration has been defined to support the installation using Best Practices.
    2.3.2     Defining a Logical System for SAP BW (SAP BW)
    Use
    In this step, you define the logical systems in your distributed system.
    Prerequisites
    Logical systems are defined cross-client. Therefore cross-client customizing must be allowed in your client  (this can be checked in transaction SCC4).
    Procedure
    To carry out the activity, choose one of the following navigation options:
    SAP BW Role Menu     Defining a Logical System for SAP BW (SAP BW)
    Transaction Code     SPRO
    IMG Menu     SAP Reference IMG → SAP Customizing Implementation Guide → SAP NetWeaver → Business Information Warehouse → Links to other Systems → General Connection Settings → Define Logical System
    1.     A dialog box informs you that the table is cross-client. Choose Continue.
    2.     On the Change View "Logical Systems": Overview screen, choose New entries.
    3.     On the New Entries: Overview of Added Entries screen enter the following data:
    Field name     Description     R/O/C     User action and values     Note
    Log. System     Technical Name of the Logical System          Enter a name for the logical BW system that you want to create     
    Name     Textual Description of the Logical System          Enter a clear description for the logical BW system     
    4.     Choose Save.
    If a transport request for workbench and customizing is displayed choose existing requests or create new requests.
    If you want to continue with the next activity, do not leave the transaction.
    Result
    You have created a Logical System Name for your SAP BW client.
    2.3.3     Assigning Logical System to Client (SAP BW)
    Procedure
    To carry out the activity, choose one of the following navigation options:
    SAP BW
    Role Menu     Assigning Logical System to Client (SAP BW)
    Transaction Code     SCC4
    SAP BW Menu     Tools → Administration → Administration → Client Administration → Client Maintenance
    1.     In the view Display View "Clients": Overview, choose Display → Change.
    2.     Confirm the message.
    3.     Select your BW client.
    4.     Choose Details.
    5.     In the view Change View "Clients": Details, insert your BW system in the Logical system field, for example, BS7CLNT100.
    6.     Save the entries and go back.
    2.3.4     Opening Administrator Workbench
    Procedure
    To carry out the activity, choose one of the following navigation options
    SAP BW     Modeling → Administrator Workbench: Modeling
    Transaction Code     RSA1
    1.     In the Replicate Metadata dialog box, choose Only Activate.
    2.     If a message appears that you are only authorized to work in client ... (Brain 009) refer to SAP Note 316923 (do not import the support package, but use the description under section Workaround).
    2.3.5     Creating an RFC-User (SAP BW)
    Procedure
    To carry out the activity, choose one of the following navigation options:
    SAP BW Role Menu     Creating RFC User
    Transaction Code     SU01
    SAP BW Menu     Tools → Administration → User Maintenance → Users
    Then carry out the following steps:
    1.     On the User Maintenance: Initial Screen screen:
    a.     Enter the following data:
    Field      Entry
    User     RFCUSER
    b.     Choose Create.
    2.     On the Maintain User screen:
    a.     Choose the Address tab.
    b.     Enter the following data:
    Field     Entry
    Last Name     RFCUSER
    Function     Default-User for RFC connection
    c.     Choose the Logon data tab.
    d.     Enter the following data:
    Field     Entry
    Password     LOGIN
    User type     System
    e.     Choose the Profiles tab.
    f.     Enter the following data:
    Field     Entry
    Profiles     SAP_ALL , SAP_NEW and S_BI-WHM_RFC
    g.     Choose Save.
    Do not change the password of this user as it is used in RFC connections.
    2.3.6     Define RFC-USER as default (SAP BW)
    Procedure
    To carry out the activity, choose one of the following navigation options
    SAP BW Role Menu     Define RFC-USER as default (SAP BW)
    Transaction Code     RSA1
    SAP BW Menu     Modeling → Administrator Workbench: Modeling
    1.     On the Administrator Workbench: Modeling screen choose Settings → Global Settings.
    2.     In the Global Settings/Customizing dialog box choose Glob. Settings.
    3.     On the Display View "RSADMINA Maintenance View": Details screen:
    a.     Choose Display → Change.
    b.     Enter RFCUSER in the BW User ALE field.
    c.     Choose Save.
    Leave the transaction in order to activate the entries you have made.
    2.5     SAP SRM
    2.5.1     Define Client Administration
    Use
    This activity defines changes and transports of the client-dependent and client-independent objects.
    Procedure
    1.     Access the transaction using:
    SAP SRM/ Role Menu     Local Settings → SAP SRM → Define Client Administration
    Transaction Code     SCC4
    2.     Switch to change mode.
    3.     Select your client.
    4.     Choose details.
    5.     Check the entries for currency and client role.
    6.     Check the settings for changes and transport of client-specific objects and client-independent object changes
    If you want to use the settings made by BC Sets or manually in other systems (other than BW), "Automatic recording of changes" and "Changes to Repository object and cross-client Customizing allowed" are required.
    7.     In the Restrictions area, set the flag Allows CATT processes to be started.
    This flag must be set. Otherwise, activities using CATT procedures cannot be used for the installation.
    Result
    Client administration has been defined to support the installation using Best Practices.
    2.5.2     Define a Logical System for SAP SRM
    Use
    The logical system is important for the communication between several systems. This activity is used to define the logical systems for the Enterprise Buyer and back-end system.
    Procedure
    1.     Access the transaction using:
    IMG Menu
    Enterprise Buyer      Enterprise Buyer Professional Edition → Technical Basic Settings → ALE Settings (Logical System) → Distribution (ALE) → Sending and Receiving System → Logical Systems → Define Logical System.
    Transaction Code     SPRO
    2.     For the activity type, select Cross-client.
    3.     The following naming convention is recommended:
    Log. System     Name
    YYYCLNTXXX     Enterprise Buyer System
    4.     Save your entries
    You have to maintain at least two systems (local Enterprise Buyer system and the SAP R/3 back-end system)
    Naming Conventions: XXXCLNT123 (XXX = system ID, 123 = client number)
    2.5.3     Assign Logical System to Client
    Use
    The purpose of this activity is to define the
    •     Enterprise Buyer client you will be using
    •     Standard currency to be used
    •     Recording of changes
    •     Capability for your system to use CATT procedures
    Procedure
    1.     Access the transaction using:
    SAP SRM
    Role Menu     Local Settings → SAP SRM → Assign Logical System to Client
    Transaction Code     SCC4
    2.     Switch to the Change mode.
    3.     Select your Enterprise Buyer client and go to the Client Details screen.
    4.     In the Logical system screen, choose the logical system for the client.
    5.     Set the currency in the Std currency field to a valid entry, such as USD or EUR.
    6.     Make the following settings:
    Setting     Values
    Changes and transports for client-specific objects     Automatic recording of changes
    Restrictions when starting CATT and eCATT     eCATT and CATT allowed
    7.     Choose Save.
    Using this transaction, you can change from the production client to the development client and back again in the Client role field.
    Result
    The logical system has been assigned to the client and CATT procedures can be executed now.
    2.5.4     Create System Users
    Use
    This task creates remote users RFCUSER, BBP_JOB, WEBLOGIN   for the SAP R/3 back-end system and for Enterprise Buyer.
    Procedure
    1.     Access the transaction using:
    SAP Menu
    Enterprise Buyer      Basis Tools → Administration → User Maintenance → Users
    Transaction Code     SU01
    2.     Enter RFCUSER in the User field.
    3.     On the Address tab Choose Lastname RFCUSER.
    4.     Choose Create.
    5.     Enter the password LOGIN on the Logon data tab.
    6.     As User Type, select System.
    7.     Go to the Profiles tab.
    8.     Enter the profiles SAP_ALL ,SAP_NEW and S_BI-WX_RFC.
    9.     Save your entries.
    10.     Repeat this procedure to create the user BBP_JOB (Password: LOGIN).
    11.     Repeat this procedure to create the user WEBLOGIN (Password: SAPPHIRE).
    Result
    The following users have been created.
    Client     User     Password
    Enterprise Buyer      RFCUSER     LOGIN
    Enterprise Buyer     BBP_JOB     LOGIN
    Enterprise Buyer     WEBLOGIN     SAPPHIRE
         USER/Password from the Service File of the ITS Installation.
    3     Cross Connectivity
    This chapter describes all settings that are necessary to connect the components of the SAP system landscape with each other. The settings for each combination of two components to be connected are described in a separate structure node. The separate section headings make it possible to identify the activities required to connect certain components with each other. The section headings for components that are not part of the installation can be skipped.
    3.1     Connecting SAP BW with SAP R/3, SAP CRM, SAP SRM
    Procedure
    To carry out the activity, choose one of the following navigation options in the SAP BW system:
    SAP BW Role Menu     Connecting SAP BW with SAP R/3, SAP CRM, SAP SRM
    Transaction code     RSA1
    SAP BW Menu     Modeling → Administrator Workbench: Modeling
    1.     Choose Modeling.
    2.     Choose Source Systems.
    3.     Select Source Systems in the window on the right.
    4.     Choose the Context menu (right mouse click).
    5.     Choose Create.
    6.     Select SAP System from Release 3.0E (Automatic Creation).
    7.     Choose Transfer.
    8.     Make the following entries:
    Field     Entry
    Target computer (server)     Server of the SAP R/3, SAP CRM or SAP SRM system
    System ID     System ID of the SAP R/3, SAP CRM or SAP SRM system
    System number     System number of the SAP R/3, SAP CRM or SAP SRM system
    Background user in source system     RFCUSER
    Password for source system     LOGIN
    Background user in BW     RFCUSER (can not be changed in this activity)
    Password for BW user     LOGIN
    9.     On the dialog box Please log on as an administrator in the following screen choose Continue.
    10.     Log on to the Source System with your administrator user. Choose the correct client.
    11.     On the dialog box New Source System Connection choose Continue.
    12.     On the Replicate Metadata dialog box, choose Only Activate.
    4. Data from SQL Server through DB Connect
    Check here....
    http://help.sap.com/saphelp_nw04/helpdata/en/a6/4ee0a1cd71cc45a5d0a561feeaa360/content.htm

  • Please tell me what is v3 update in extraction ,pabitra

    Hi sap Gurus,
    Please tell me: what is the V3 update in extraction? How do we use it
    in extraction in real-time scenarios? Advantages and disadvantages?
    Points will definitely be awarded.
    Thanks in advance
    regards
    pabitra
    Edited by: pabitra mohanty on May 28, 2008 4:18 AM

    Hi Pabitra,
    Please go through the blog by Roberto on this:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    and let me also present it in brief:
    V1 - Synchronous update
    V2 - Asynchronous update
    V3 - Batch asynchronous update
    These are different work processes on the application server that take the update LUW (which may contain various DB manipulation SQL statements) from the running program and execute it. They are separated to optimize transaction processing capabilities.
    Taking an example -
    If you create/change a purchase order (me21n/me22n), when you press 'SAVE' and see a success message (PO.... changed..), the update to underlying tables EKKO/EKPO has happened (before you saw the message). This update was executed in the V1 work process.
    There are some statistics collecting tables in the system which can capture data for reporting. For example, LIS table S012 stores purchasing data (it is the same data as EKKO/EKPO stored redundantly, but in a different structure to optimize reporting). Now, these tables are updated with the txn you just posted, in a V2 process. Depending on system load, this may happen a few seconds later (after you saw the success message). You can see V1/V2/V3 queues in SM12 or SM13.
    V3 is specifically for BW extraction. The update LUW for these is sent to V3 but is not executed immediately. You have to schedule a job (eg in LBWE definitions) to process these. This is again to optimize performance.
    V2 and V3 are separated from V1 as these are not as realtime critical (updating statistical data). If all these updates were put together in one LUW, system performance (concurrency, locking etc) would be impacted.
    Serialized V3 update is called after V2 has happened (this is how the code running these updates is written) so if you have both V2 and V3 updates from a txn, if V2 fails or is waiting, V3 will not happen yet.
    BTW, 'serialized' V3 is discontinued now, in later releases of PI you will have only unserialized V3. (This is explained nicely in the weblog).
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Cheers,
    Habeeb

  • What's the next step before making your site "live"

    Hi,
    I created a website and I wanted to get input on what my next step would be prior to uploading it to the host server. It's all ready to go live; I just figured there's probably some testing and validating I need to do to make sure everything is working as it should. So I wanted to know: what are those next steps exactly? Are there any resources (i.e. a website, etc.) out there on the web that can help me make this as painless as possible? I'm still fairly new to web design, so the simpler the better.
    Thanks!
    ashmic19

    Check spelling and grammar.
    Validate CSS & HTML code.
    To ensure menus, links and widgets all work, fully test your site in the 5 major browsers. If you must support older IE, get IE Tester.
    http://www.my-debugbar.com/wiki/IETester/HomePage
    Use Firefox's web developer toolbar.
    https://addons.mozilla.org/en-US/firefox/addon/web-developer/
    Web Page Analyzer & Optimization report
    http://www.websiteoptimization.com/services/analyze/
    Set-up Google WebMaster Tools and Google Analytics.
    http://www.google.com/webmasters/
    http://www.google.com/analytics/
    Nancy O.
    Alt-Web Design & Publishing
    Web | Graphics | Print | Media  Specialists 
    http://alt-web.com/
    http://twitter.com/altweb

  • Error while calculating download files automatically

    Hello All,
    I have configured Maintenance Optimizer for Solution Manager 7.0 EhP1. When I created a new maintenance transaction, it displayed the error "System is not assigned to a host. There are no operating system/database-dependent files for your selection." after choosing the stack level.
    Can anyone say what might be the problem?
    Thanks,
    Regards,
    Hasan

    Hi
    As per the workaround, this error got resolved after properly maintaining the Java technical system details for the Java product instance and the port details.
    Hence, please check and enter the dispatcher, port number, and database schema on the Header tab, and the J2EE server on the Instances tab.
    Check the SAP notes:
    Note 1176842 - Maintenance optimizer reports missing host assignment
    Note 1121045 - Missing host assignment in Maintenance Optimizer
    and the thread
    [MOPZ - System is not assigned to a host|MOPZ - System is not assigned to a host]
    Thanks,
    Jansi

  • What type of delta does LO run ?

    Hi.. I understand that in COPA extraction there are no migration steps involved as it runs on a "time stamp delta". If that is the case, then please let me know what type of delta LO and LIS run on. If LO and LIS also run on time stamps, then please let me know why there are no migration steps involved in COPA.

    The Logistics area uses three types of delta mechanisms:
    1. Direct Delta
    2. Queued Delta
    3. Un-serialized V3 Update
    These are different work processes on the application server that take the update LUW (which may contain various DB manipulation SQL statements) from the running program and execute it. They are separated to optimize transaction processing capabilities.
    Taking an example -
    If you create/change a purchase order (me21n/me22n), when you press 'SAVE' and see a success message (PO.... changed..), the update to underlying tables EKKO/EKPO has happened (before you saw the message). This update was executed in the V1 work process.
    There are some statistics collecting tables in the system which can capture data for reporting. For example, LIS table S012 stores purchasing data (it is the same data as EKKO/EKPO stored redundantly, but in a different structure to optimize reporting). Now, these tables are updated with the txn you just posted, in a V2 process. Depending on system load, this may happen a few seconds later (after you saw the success message). You can see V1/V2/V3 queues in SM12 or SM13.
    In a local update, the update program is run by the same work process that processed the request. The dialog user has to wait for the update to finish before entering further data. This kind of update is useful when you want to reduce the amount of access to the database. The disadvantage of local updates is their parallel nature. The updates can be processed by many different work processes, unlike asynchronous or synchronous update, where the update is serialized due to the fact that there are fewer update work processes (and maybe only one).

  • Granting Authorization Group SC in S_TABU_DIS

    Hello Gurus,
    In our SAP Security Optimization report, it has been highlighted that granting authorization group 'SC' to users opens up the following risk:
    "The authorization to view the data in table RFCDES makes it possible for a dictionary attack to find out the
    password of a user specified in an RFC connection."
    However, authorization group SC is also assigned to some other HR table under HRP* e.g. HRP1002.
    Can you advise what approach you have taken on granting access to this table?
    Thank you,
    Suriati

    Hi Bernhard,
    I was very curious about this and thinking about it while mowing the lawn this weekend
    I tried to emulate the checks described in my sandbox system (7.01 Bc SP 06) by modifying the function module.
    First of all, I think the checks can be transferred to a central form (just a logistics comment). If need be, STATICS could be used for the CASEs of the tables, which is already an import parameter (less spaghetti).
    Where I ran into some problems is with parameter transactions as "proxies" in the authorization maintenance for core transactions SM30, SE16, SM34 etc. I also tried the "old" tcodes.
    When you switch to the new S_TABU_NAM concept then it makes sense not to propose any S_TABU_DIS for the parameter transaction. In that case the core transaction pulls its empty field for S_TABU_DIS each time ...
    The TSTCA entries are also a pestilence here, even if they were maintained with exact values. I maintained SU24 for my Z_TABU_NAM object so I can, as expected, use it, but there is a huge backlog of TSTCA entries for S_TABU_DIS.
    As the check is only performed (in my prototype) when S_TABU_DIS fails, it did provide certain granularity for isolated transaction codes, but with given (in my case very explicit!) standard authorizations for S_TABU_DIS in larger (single) roles it was very difficult to cleanly migrate to the new concept (because of TSTCA and S_TABU_DIS NE Z_TABU_NAM).
    I am aware of SAP Note 1404093 but it is hard to "sell" SU24 maintenance to developers. You have it good in SAP-type systems and it is still a hassle.
    Would it be possible for the expert mode merge to look for S_TABU_NAM and then automatically set S_TABU_DIS to inactive? I removed S_TABU_DIS from the core transactions as a workaround, but it does not always work when it was already maintained.... This would require a "history" of the previous merge, I guess... ?
    I will use this in new roles but am sceptical about a migration path for existing roles and existing SU24 maintenance "in the wild".
    Is there a major SAP release change at which SU22 will switch to this new concept as well?
    Cheers,
    Julius
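    For readers following the S_TABU_NAM discussion above, a minimal illustration of the two authorization objects. The field names are the standard ones; the values are hypothetical and only contrast group-level with table-level access:
        S_TABU_DIS (coarse: whole authorization group)
            ACTVT        03 (display)
            DICBERCLS    SC
        S_TABU_NAM (granular: individually named tables)
            ACTVT        03 (display)
            TABLE        HRP1002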
