Best practice to NOT transport reports?

Hi,
I was once told that it was best practice NOT to transport reports, templates, process chains and packages (BW 3.5), because these objects changed so often that you would spend too much time transporting. Now I have heard from others that it is usual to transport these things. What is your opinion on this?
Regards Silje

Hi,
Test queries for users, and process chains that are needed but do not affect any data loads and need no testing in the test system (for example, chains that only delete PSA data), are sometimes created directly in production and not transported at all.
But even in the case of process chains, changes are not supposed to be made directly in production.
SOX compliance and SAP both say that every object should first be changed in development and then transported through to production. And as you said, it is usual to make changes to these objects and transport them to production.
Thanks
Ajeet

Similar Messages

  • Best Practice for creating Excel report from SSIS.

    I have a requirement to create an Excel report on a daily basis which pulls data from SQL. I have attempted to resolve this by creating a stored procedure to save the results in SQL, a template in Excel to hold the graphs & pivot tables and an SSIS package
    to copy the data to the template.
    Problem 1: When the data turns up in Excel it is saved as text rather than numbers.
    Problem 2: When the data turns up in Excel it appends the data rather than overwriting it.
    I resolved problem 1 by having another sheet which converts the text to numbers (=int(sheet1!A1))
    I resolved problem 2 by adding some VB script to my SSIS package which clears the existing cells before copying the data
    The job runs fine, however when I schedule the job to run overnight it complains "System.UnauthorizedAccessException: Retrieving the COM class factory for component with CLSID". A little googling tells me that running the client side commands in
    my vb script (workSheet1.Range("A2:F9999").Clear(), workBook.Save(), workBook.Close() etc) from a server side task is bad practice.
So, I am left wondering how people usually get around this problem: copying a SQL table into an existing Excel file and overwriting the data, without having the numbers turn up as text. My requirements are that the report must display pivot charts with selectable
options and be automatically updated overnight.
    Help appreciated,
    Bish.
    Office 2013 on my PC, Office 2010 on the server, Windows Server 2008R2 Enterprise, SQL Server 2008R2.

I think the best practice in a case like this is to link an Excel file to a view, or directly to a table, so you don't have to struggle with changing templates, overnight packages, etc. If the data is too complex and the requirements too demanding, then I tend to create a cube and that's it: dashboard, graphs, and everyone is happy. In your case, if the request is not too complex, try not to use SSIS; instead, build a view and point Excel directly at SQL.
SSIS is really strong for ETL, for running heavy stored procedures, for time-scheduled loads, etcetera, etcetera... I love it. But sometimes we need to find the easier solution...
I hope this post helped you
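To make the suggestion above concrete, here is a minimal sketch of such a view; the table dbo.DailySales and its columns are hypothetical stand-ins for whatever the stored procedure currently produces:
[code]
-- Hypothetical example: a view Excel can link to directly
-- (Data > From Other Sources > From SQL Server), refreshed on open or on a schedule.
-- Because the columns keep their SQL types, numbers arrive as numbers, not text,
-- and a refresh replaces the data instead of appending to it.
CREATE VIEW dbo.vw_DailySalesReport
AS
SELECT
    CAST(s.SaleDate AS date) AS SaleDate,
    s.Region,
    SUM(s.Quantity)          AS TotalQuantity,
    SUM(s.Amount)            AS TotalAmount
FROM dbo.DailySales AS s
GROUP BY CAST(s.SaleDate AS date), s.Region;
[/code]
A pivot table or pivot chart in the template can then use this view as its external data source, which would cover the selectable options and the overnight update without any client-side COM automation.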

  • Best Practice for Master Data Reporting

    Dear SAP-Experts,
    We face a challenge at the moment and we are still trying to find the right approach to it:
    Business requirement is to analyze SAP Material-related Master Data with the BEx Analyzer (Master Data Reporting)
    Questions they want to answer here are for example:
    - How many active Materials/SKUs do we have?
    - Which country/Sales Org has adopted certain Materials?
    - How many Series do we have?
- How many SKUs belong to a specific season?
- How many SKUs are in a certain product lifecycle stage?
    - etc.
The challenge is that the Master Data is stored in tables with different keys in R/3.
    The keys in these tables are on various levels (a selection below):
    - Material
    - Material / Sales Org / Distribution Channel
    - Material / Grid Value
    - Material / Grid Value / Sales Org / Distribution Channel
    - Material / Grid Value / Sales Org / Distribution Channel / Season
    - Material / Plant
    - Material / Plant / Category
    - Material / Sales Org / Category
    etc.
So even though the information is available at different levels of detail, the business requirement is to have one query/report that combines all the information. We are currently struggling a bit to decide what the best approach to this requirement would be. Did anyone face such a requirement before, and what would be the best practice? We already tried to find information online, but it seems master data reporting is not very well documented. Thanks a lot for your valuable contribution to this discussion.
    Best regards
    Lukas
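As a purely illustrative sketch (not a confirmed SAP recommendation), the first question above would look roughly like this in plain SQL against the R/3 tables MARA (material) and MVKE (material / sales org / distribution channel); in BW, combining keys at different levels like this is typically what an InfoSet over the corresponding master data InfoObjects is used for:
[code]
-- Illustrative only: active SKUs adopted per sales org / distribution channel.
-- MARA key: MATNR (material); MVKE key: MATNR / VKORG / VTWEG.
-- "Active" is approximated here by the deletion flag LVORM.
SELECT mv.VKORG                  AS sales_org,
       mv.VTWEG                  AS distribution_channel,
       COUNT(DISTINCT ma.MATNR)  AS active_materials
FROM   MARA ma
JOIN   MVKE mv ON mv.MATNR = ma.MATNR
WHERE  ma.LVORM <> 'X'           -- not flagged for deletion
GROUP  BY mv.VKORG, mv.VTWEG;
[/code]
The point of the sketch is that each extra key level (grid value, plant, season, ...) adds another join at a different grain, which is exactly why one flat query over all of them is hard to model.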

    Pass a reference to the parent into the modal popup. Then you
    can reference anything in the parent scope.
I haven't done this in 2.0 yet so I can't give you code. I'll
    post if I do.
    Oh, also, you can reference the parent using parentDocument.
    So in the popup you could do:
    parentDocument.myPublicVariable = "whatever";
    Tracy

  • Best practice of OSB logging Report handling or java code using publish

    Hi all,
I want to do common error handling in OSB. I did two implementations as below and just want to know which one is the best practice.
1. Using a custom report handler --> whenever we want to log, we use the report action of OSB, which calls a custom Java class that
logs the data into the DB.
2. Using a plain Java class --> creating a Java class and publishing it to the proxy, which will call this Java class and do the logging.
Which is the best practice, and what are the pros and cons?
    Thanks
    Phani
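Both options ultimately insert into a database table; purely for illustration, a minimal sketch of such a log table (all names hypothetical, Oracle syntax, with the key filled from a sequence):
[code]
-- Hypothetical log table that either the report handler (option 1)
-- or the plain Java class (option 2) could insert into.
CREATE TABLE osb_message_log (
    log_id       NUMBER         NOT NULL,   -- fill from a sequence
    proxy_name   VARCHAR2(200),             -- which proxy service logged
    severity     VARCHAR2(20),              -- e.g. ERROR, WARN, INFO
    message_id   VARCHAR2(100),             -- correlation id of the message
    payload      CLOB,                      -- reported body or fault detail
    logged_at    TIMESTAMP DEFAULT SYSTIMESTAMP,
    CONSTRAINT pk_osb_message_log PRIMARY KEY (log_id)
);
[/code]
Broadly, the report action integrates with OSB's message reporting in the console, while a plain Java callout is lighter-weight but bypasses that tooling; that trade-off is usually what decides between the two.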

    Hi Anuj,
    Thanks for the links, they have been helpful.
I understand now that OSR is meant to contain only Proxy services. The synch facility is between OSR and OSB so that, in case you are not using OER, you can publish Proxy services to OSR from OSB. What I didn't understand was why there was an option to publish a Proxy service back to OSB and why it ended up as a Business service. From the link you provided, it mentioned that this case is for multi-domain OSBs, where one OSB wants to use the other OSB's service. It is clear now.
    Some more questions:
    1) In the design-time, in OER no Endpoints are generated for Proxy services. Then how do we publish our design-time services to OSR for testing purposes? What is the correct way of doing this?
    Thanks,
    Umar

  • SAP Best Practices Business Package Transport Error

    Hi all,
We tried to import the SAP Best Practices Business Package, but we are unable to import the file. We have checked the log; below is the log file:
    [code]
    #1.5#001321CB9A4500510000022100001F7800040025EB1E5E74#1126062765280#com.sap.portal.transport#sap.com/irj#com.sap.portal.transport#Administrator#1033##DEV1_EPD_9515850#Administrator#7b5495401f4b11da9472001321cb9a45#SAPEngine_Application_Thread[impl:3]_9##0#0#Error#1#/System/Server#Java### [com.sapportals.portal.transport.ui.ImportComponent] File upload failed.
    [EXCEPTION]
#1#java.io.IOException: Bad ZIP file format: C:\WINDOWS\TEMP\_htmlb11260627630311218.tmp; does not contain a package file.
         at com.sapportals.portal.transport.ui.ImportComponent.extractPackageZip(ImportComponent.java:1959)
         at com.sapportals.portal.transport.ui.ImportComponent.doFileUpload(ImportComponent.java:822)
         at com.sapportals.portal.transport.ui.ImportComponent.doOnUpdatePreviewPage(ImportComponent.java:536)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.sapportals.portal.prt.component.AbstractPortalComponent.handleRequestEvent(AbstractPortalComponent.java:700)
         at com.sapportals.portal.prt.component.AbstractPortalComponent.handleEvent(AbstractPortalComponent.java:412)
         at com.sapportals.portal.prt.pom.ComponentNode.handleEvent(ComponentNode.java:252)
         at com.sapportals.portal.prt.pom.PortalNode.fireEventOnNode(PortalNode.java:369)
         at com.sapportals.portal.prt.core.PortalRequestManager.runRequestCycle(PortalRequestManager.java:707)
         at com.sapportals.portal.prt.connection.ServletConnection.handleRequest(ServletConnection.java:232)
         at com.sapportals.portal.prt.dispatcher.Dispatcher$doService.run(Dispatcher.java:522)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sapportals.portal.prt.dispatcher.Dispatcher.service(Dispatcher.java:405)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.servlet.InvokerServlet.service(InvokerServlet.java:153)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:385)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:263)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:340)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:318)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:821)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:239)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:147)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:37)
         at com.sap.engine.core.cluster.impl6.session.UnorderedChannel$MessageRunner.run(UnorderedChannel.java:71)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:94)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:162)
    [/code]
Has anyone faced a similar issue?
    Regards
    Vasu

    Hello Vasudevan,
      Unzip the content of the Business Package zip file into some folder. While importing the business package, select the .pkg file in the unzipped folder. This solves your problem.
    Best Regards,
    Srinivas.

  • Best Practice on Not Exposing your internal FQDN to the outside world

    Exchange server 2010, sits in DMZ, internet facing. The server is currently using the Default Receive Connector. This exposes the internal fqdn to the outside world (ehlo). Since you should not (can't) change the FQDN on your Default Receive connector, what
    is the best practice here?
    The only solution I can see is the following:
    1. Change the Network on the Default Receive Connector to only internal IP addresses.
    2. Create a new Internet Receive Connector port 25 for external IP addresses (not sure what to put in Network tab?) and use my external FQDN for ehlo responses (e.g. mail.domain.com)
3. What do I pick for Auth and Permissions: TLS and Anonymous only?
    Michael Maxwell

Yes, it fails PCI testing/compliance. I shouldn't be able to see my internal server and domain. I understand that is the recommendation, but my client doesn't want to host in the cloud or go with a Trend IHMS (trust me, I like that better, but it's
not my choice). I have to work with the deck of cards dealt to me. Thanks, I just want a solution with what I have now.
    Michael Maxwell
Understood. I won't go into the value of those tests :)
If the customer is really concerned about exposing the internal name, then create a new receive connector with a different FQDN (and corresponding cert) for anonymous connections, as you mention above. Know that it also means internal clients
can connect to the server on port 25 as well if you don't have the ability to scope it to a set of IP addresses (i.e. an SMTP gateway).
    The internal names of the servers will also be in the internet headers of messages sent out:
    http://exchangepedia.com/2008/05/removing-internal-host-names-and-ip-addresses-from-message-headers.html
    http://www.msexchange.org/kbase/ExchangeServerTips/ExchangeServer2007/SecurityMessageHygiene/HowtoremoveinternalservernamesandIPaddressesfromSMTPheaders.html
    Please Note: My Posts are provided “AS IS” without warranty of any kind, either expressed or implied.

  • Best Practice for embedding webi reports in hyperlinks..

    SAP Community
What is the best practice for embedding hyperlinks (email) from Webi and sending them to a user, so the users can quickly consume the report with minimal effort (click the link and launch InfoView/the report)?
    John

As mentioned already, BI has its own inbox, and/or SMTP integration for broadcasting.
Else, if you go to Folders, right-click on your report instance, then select "Copy URL" (or 'document link', I cannot remember the exact term) - that would give you an OpenDocument link to invoke the viewer.
    Regards,
    H

  • Best practice for BW transport

Hi,
When I am working on BW object transport from the BW development system to the
BW QAS (test) system, there are two methods to activate the standard BCT objects: (1) activate them in the QAS system directly, (2) transport them from DEV to QAS.
Can anyone tell me which method is the best way to activate standard BCT objects?
Can anyone tell me the advantages and disadvantages of activating BCT objects directly in the QAS system?

    Hi Alex,
I would always suggest you activate the objects in the development system and then transport them across all the systems, so that all the systems are in sync with each other. I can take a small example to validate my opinion.
Let's say you want to create a BEx query on an InfoCube, and for that you need to activate one InfoObject (X1). Let's say you have activated it in your dev system (otherwise you can't use that object in your query), but you have not activated the same InfoObject in the quality system, because you have not transported it to the target system before.
If you transport the query that contains X1, the transport will fail in the quality system, and you don't know how many of these problems you are going to hit in the future.
Another example could be a system refresh. Sometimes we perform a system refresh for integration testing (or a release upgrade etc.), and you might choose to refresh the quality system from production (a common scenario in most BW environments); then you are going to hit active/inactive object issues if both systems are not in sync and you have activated the Business Content objects directly in the quality system.
Basically, I am trying to say: follow the rule of thumb that all the systems should be in sync with each other. Hence it is always better to activate the BCT in the dev system and transport it all the way through to the production system.
    BCT Activation in Dev system:
    Advantages: -
    All the landscapes are in sync with each other.
    No transport failure issues due to BCT objects in future.
    No additional hassles during system refresh.
No need to activate the objects separately in every system (additional effort and resources required).
    Disadvantages: -
Need to transport the BCT objects in a proper sequence. But this is not really a disadvantage.
    Hope it helps.
    Thanks,
    Soumya

  • Best Practice Analyzer Results: Health Report Error EDS AlertValue Unhealthy.

    I ran the Microsoft Office 365 Best Practices Analyzer Beta 1.0 and I get the following error:
C:\windows\system32>Get-HealthReport servername -RollupGroup
Then I got lots of results; I narrowed them down to the following:
    PSComputerName          : kaneex13.kanecpas.local
    RunspaceId              : 85204a86-04f3-4779-9cad-3092ebfe3435
    PSShowComputerName      : False
    Server                  : kaneex13.kanecpas.local
    CurrentHealthSetState   : NotApplicable
    Name                    : MaintenanceFailureMonitor.EDS
    TargetResource          :
    HealthSetName           : EDS
    HealthGroupName         : ServiceComponents
    AlertValue              : Unhealthy
    FirstAlertObservedTime  : 2/6/2015 9:12:57 AM
    Description             :
    IsHaImpacting           : False
    RecurranceInterval      : 300
    DefinitionCreatedTime   : 2/6/2015 8:58:03 AM
    HealthSetDescription    :
    ServerComponentName     : None
    LastTransitionTime      : 2/6/2015 9:12:57 AM
    LastExecutionTime       : 2/6/2015 12:38:00 PM
    LastExecutionResult     : Succeeded
    ResultId                : 57636932
    WorkItemId              : 94
    IsStale                 : False
    Error                   :
    Exception               :
    IsNotified              : False
    LastFailedProbeId       : -301690410
    LastFailedProbeResultId : 351526122
    ServicePriority         : 0
    Identity                : EDS\MaintenanceFailureMonitor.EDS\
    IsValid                 : True
    ObjectState             : New
I tried to fix it, and these are my findings:
    https://technet.microsoft.com/en-us/library/ms.exch.scom.eds(v=exchg.150).aspx
    I'm running Exchange 2013 on Server 2012

    Hi,
Based on my research, it's a known issue that there will be a 1006 error in the application log after we install a new Exchange 2013 server:
    http://social.technet.microsoft.com/Forums/en-US/5ab1a91a-ccd4-49fb-a451-159592fc85d4/msexchangediagnostics-error-1006-logical-to-physical-size-ratio-free-megabytes?forum=exchangesvradmin
    And it can be resolved by setting the value of DriveSpaceTrigger to false:
    http://windowsitpro.com/blog/case-erroneous-disk-space-checker
In your case, we can first try restarting the Microsoft Exchange Diagnostics service.
    Note: Microsoft is providing this information as a convenience to you. The sites are not controlled by Microsoft. Microsoft cannot make any representations regarding the quality, safety, or suitability of any software or information found there. Please make
    sure that you completely understand the risk before retrieving any suggestions from the above link.
    If you have any question, please feel free to let me know.
    Thanks,
    Angela 
    Angela Shi
    TechNet Community Support

  • Best practice for making a report of 10,000 to 20,000 rows(OBIEE 10.3.4.1)

My scenario is like this:
Hi, I have two fact tables, Fact1 and Fact2, and four dimension tables D1, D2, D3, D4, plus D1.1 and D1.2, which are derived from D1 (so D1 might be a snowflake). The relations in the data model are:
[(D1 1:M -> Fact1, D1 1:M -> Fact2), (D2 1:M -> Fact1, D2 1:M -> Fact2), (D3 1:M -> Fact1, D3 1:M -> Fact2), (D4 1:M -> Fact1, D4 1:M -> Fact2)]
From D1 there is a child level like this: [D1 1:M -> D1.1, then D1.1 1:M -> D1.2, then D1.2 1:M -> D4]
Please help me in modeling these for making a report of 10,000 rows, and also let me know for which tables I need to enable caching.
PS: There shouldn't be performance issues, so please help me in modeling this.
    Thanks in Advance for the Experts who are helping me for a while.

Shouldn't be much of a problem with just this many rows...
Model it along the lines of this thread: Re: URGENT MODELING SNOW FLAKE SCHEMA
There are various ways of handling performance issues, if any, in OBIEE.
Go for a caching strategy for the complete warehouse, and make sure to purge it after every data load. If you have aggregate calculations at a higher level, then you can also go for aggregated tables in OBIEE for better performance:
http://www.rittmanmead.com/2007/10/using-the-obiee-aggregate-persistence-wizard/
Hope this is clear... Go ahead with the actual implementation and let us know in case you encounter any major issues.
    Cheers
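To make the aggregated-tables point concrete, a hypothetical sketch (in practice the Aggregate Persistence Wizard linked above generates and maps these for you; table and column names are invented):
[code]
-- Illustrative only: a summary table at the D1 grain, so the report
-- scans far fewer rows than the base fact table.
CREATE TABLE fact1_agg_d1 AS
SELECT d1_key,
       SUM(measure1) AS measure1_total,
       COUNT(*)      AS base_row_count
FROM   fact1
GROUP  BY d1_key;
[/code]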

  • Best Practices for Multiple Forms-Reports Instances (WebLogic) on Win2008R2

    Hello all,
I’ve succeeded in creating two instances of Forms/Reports (FR) in WebLogic and am looking at about 5 or 6 FR instances on one Windows Server 2008R2 box. I understand each instance will have its own folder structure (FR_Inst1) under the Middleware folder. Currently I have two instances configured, each with its own Home (FR_Home1 & FR_Home2) and Domain (FR_Domain1 & FR_Domain2). Both of the separate FR applications function correctly.
    Can multiple similar type instances share a single Home (FR_Home)? Can multiple similar type instances share a single Domain outside of the WebLogic Domain, such as FR_Domain?
    Thanks,
    Ron

Thanks for the reply! I read through the documents and they are very good at explaining how to install the different components individually. I still can't find much on installing them together. I hope it's not just going to be a trial-and-error thing.
So far I've done the following successfully:
    Installed 10.3.5 weblogic
    Forms and Reports 11g on top of 10.3.5
    I've created an additional managed server for our ADF applications.
    My next step is upgrading the JSF to 2.x. I would have to stage patches 12917525 and 12979653. I'm afraid it will break the forms and reports though. Any ideas?

  • In Oracle, which is best practice for 'NOT EQUAL TO'

I need to check for where a decimal value is not zero.
Am I better to use
<> 0
or
!= 0
or does Oracle effectively substitute <> with != ?
In my mind, if it doesn't, the <> would be slower because it has to check if the value is greater than OR less than.
I appreciate != is not SQL-92 compliant.
I appreciate it would be less than a nanosecond difference, but they all add up!
Thanks
(The < and > symbols did not show originally, it seems, as they are used in HTML.)
Edited by: cubmar on Jun 5, 2009 11:16 AM
Edited by: cubmar on Jun 5, 2009 11:21 AM

Even though the CBO is a cleverly written program, it does make assumptions. Whenever the optimizer finds a NOT operator, it assumes the predicate is going to fetch a large portion of the data and goes for a FULL TABLE SCAN; an INDEX SCAN is not an option.
Hence, if you are not going to select a large portion of the data, using NOT can be expensive.
Below is an example.
I have created a table which has one record with the value 1 and 99,999 records with the value 0.
    SQL> create table t
      2  as
      3  select decode(level,1,1,0) no, rpad('*',100,'*') name
      4    from dual
      5  connect by level <= 100000
      6  /
    Table created.
    SQL> create index t_idx on t(no)
      2  /
    Index created.
    SQL> exec dbms_stats.gather_table_stats(user,'T',cascade=>true)
PL/SQL procedure successfully completed.
So when I use the condition no != 0, I am going to select only 1 record, so I would expect the optimizer to go for an INDEX RANGE SCAN, but see what happens.
    SQL> explain plan
      2  for
      3  select * from t where no != 0
      4  /
    Explained.
    SQL> select * from table(dbms_xplan.display)
      2  /
    PLAN_TABLE_OUTPUT
    Plan hash value: 1601196873
    | Id  | Operation         | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT  |      |     1 |   103 |   345   (2)| 00:00:05 |
    |*  1 |  TABLE ACCESS FULL| T    |     1 |   103 |   345   (2)| 00:00:05 |
    Predicate Information (identified by operation id):
    PLAN_TABLE_OUTPUT
       1 - filter("NO"<>0)
13 rows selected.
But with the use of < or >, you can get an INDEX RANGE SCAN.
    SQL> explain plan
      2  for
      3  select * from t where no >0 or no <0
      4  /
    Explained.
    SQL> select * from table(dbms_xplan.display)
      2  /
    PLAN_TABLE_OUTPUT
    Plan hash value: 4259936809
    | Id  | Operation                    | Name  | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT             |       |     2 |   206 |     6   (0)| 00:00:01 |
    |   1 |  CONCATENATION               |       |       |       |            |          |
    |   2 |   TABLE ACCESS BY INDEX ROWID| T     |     1 |   103 |     3   (0)| 00:00:01 |
    |*  3 |    INDEX RANGE SCAN          | T_IDX |     1 |       |     2   (0)| 00:00:01 |
    |   4 |   TABLE ACCESS BY INDEX ROWID| T     |     1 |   103 |     3   (0)| 00:00:01 |
    |*  5 |    INDEX RANGE SCAN          | T_IDX |     1 |       |     2   (0)| 00:00:01 |
    PLAN_TABLE_OUTPUT
    Predicate Information (identified by operation id):
       3 - access("NO"<0)
       5 - access("NO">0)
           filter(LNNVL("NO"<0))
19 rows selected.
So you must be cautious when you use the NOT operator.

  • Best Practice for report output of CRM Notes field data

My company has a requirement to produce a report with variable output, based upon a keyword search of our CRM Request Notes data. Example: the business wants a report returning all Service Requests where the Notes field contains the word "pay" or "payee" or "payment". As part of the report output, the business wants to freely select the output fields meant to accompany the notes data. Can anyone please advise on SAP's best practice for meeting a report requirement such as this? Is a custom ABAP application built? Does the data get moved to BW for reporting (and how are notes handled)? Is the data moved to a separate system?
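Whatever target system is chosen, the keyword filter itself is simple once the notes sit in a reportable store; a hypothetical sketch (table and column names invented for illustration):
[code]
-- Illustrative only: service requests whose notes mention payment terms.
-- LIKE '%pay%' already matches 'payee' and 'payment' as substrings.
SELECT sr.request_id,
       sr.created_on,
       sr.status,
       n.note_text
FROM   service_request sr
JOIN   request_note    n ON n.request_id = sr.request_id
WHERE  LOWER(n.note_text) LIKE '%pay%';
[/code]
The hard part the question raises is getting the long-text notes into such a store at all; notes/long texts are typically not delivered by the standard extractors, which is why a custom ABAP extractor or a separate system usually enters the discussion.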

    Hi David,
    I would leave your query
    "Am I doing something wrong and did I miss something that would prevent this problem?"
    to the experts/ gurus out here on this forum.
    From my end, you can follow
    TOP 10 EXCEL TIPS FOR SUCCESS
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/204c3259-edb2-2b10-4a84-a754c9e1aea8
    Please follow the Xcelsius Best Practices at
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a084a11c-6564-2b10-79ac-cc1eb3f017ac
    In order to reduce the size of xlf and swf files follow
    http://myxcelsius.com/2009/03/18/reduce-the-size-of-your-xlf-and-swf-files/
    Hope this helps to certain extent.
    Regards
    Nikhil

  • The best practice for creating reports and dashboard

    Hello guys
I am trying to put together a list of best practices on how to create reports and dashboards using the OBIEE presentation service. I know a lot of those dos and don'ts are just corporate words that don't apply consistently in real-world environments, but still I'd like to know whether Oracle has any officially defined best practices or not.
    the only best practice I can think of when it comes to building reports and dashboards is:
    Each subject area should contain only one star schema that holds data for a specific business information
    Is there anything else?
    Please advice
    Thanks

Read this book to understand what a dashboard is, and what it should do and look like, to be used by the end users. Very enlightening:
Information Dashboard Design: The Effective Visual Communication of Data by Stephen Few. (There are a couple of other books by Stephen, and although I haven't read them yet, I anticipate them to be equally helpful.)
    This book was also helpful to me:
    http://www.amazon.com/Performance-Dashboards-Measuring-Monitoring-Managing/dp/0471724173
    I also found this book helpful in Best Practices...
    http://www.biconsultinggroup.com/knowledgebase.asp?CategoryID=337

  • Best practice: Developing report in Rich Client or InfoView?

    Hi Experts,
    I have a question on the best practice of developing webi reports.
    From what I know, a Webi report can be created in Rich Client and then exported to one or more folders. From InfoView, the report can also be changed, but the change is only local to the folder.
To simplify development and maintenance, I believe both creation and changes should be done solely in either Rich Client or InfoView. However, some features are only available in InfoView, not in Rich Client; one example is a hyperlink to another Webi report. As a second step, I can add the extra features in InfoView after the export. However, if I change the report in Rich Client and re-export it, the extra features added via InfoView (e.g. report hyperlinks) will be overwritten.
    As I'm new to BO, may I have some recommendations on the best practice for building reports? For instance:
    1) Only in Rich Client - no adding of feature via InfoView
    2) First in Rich Client, then in InfoView - extra features need to be added again after each export
    3) Only in InfoView -  all activities done in InfoView, no development in Rich Client
    4) Others?
    Any advice is much appreciated.
    Linda
    Edited by: Linda on May 26, 2009 4:28 AM

    Hi Ramaks, George and other experts,
    Thanks a lot for your replies.
    For my client, the developers will build most of the reports for regular users to view. However, some power users may also create their own reports to meet ad-hoc reporting requirements.
    It's quite unlikely for my client to develop reports based on Excel or CSV data files. And we need to use features such as hyperlink for documents (which is not available in Rich Client). Based on these considerations, I'm thinking of doing all development in InfoView (both developers and power users). Do you foresee any issue if I go for this approach?
    Thanks in advance.
    Linda

Maybe you are looking for

  • 20" studio display & AGP G4

    i bought a 20" flat panel LCD acrylic studio display and can't get it to work with my power mac G4 AGP "sawtooth". i have the display hooked via DVI to ADC adapter like i did with my 17" flat panel LCD acrylic studio display, which worked fine. the 2

  • 5.1 SP10 How do you programmatically make a datasource from a dynamically created Pool?

    How do you programmatically make a data source from a dynamically created Pool? I can make the Pool OK but can not find any classes the return a datasource given a JDBC Pool. Thanks in advance

  • XL reporter in Excel 2013

    hi all, is XL reporter compatible with Excel 2013? Were using SAP b1 version 9. XL reporter is giving error message  Security settings in ms excel prohibit xl reporter from running.. VBA in macro settings already enabled. thank you for your help

  • Logic pro won't install on macbook pro

    I have Logic Studio 9.1.5 working on my iMac i7 under OS X 6.8.  It is an upgrade of an upgrade, etc going back to V 5.5.  I have the original Emagic key from Logic 5.5.  My recollection is that I had to use this to install the Logic Studio upgrade t

  • How to keep track of sessions ?

    Hi, Recently at an interview I was asked that how do you keep track of n number of user sessions in JSP ? Can somebody help me with the answer that I should have given ? Thanks.