Want to check which data source is used by the composites

Hi all,
I have a requirement where I want to see which data source is used by the composites deployed on SOA.
I don't have the code.
Is there any quick way to do it (from the console or the backend, without downloading the code)?
Thanks

You can export the jar file from the EM console:
1) Log in to the EM console.
2) Right-click on the composite, select Export, and with the default options click the Export button.
Find the data source name from the exported code.
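For example, here is a rough way to grep the exported archive for data source references (a sketch in Python; it assumes the connections show up as JNDI names such as eis/DB/... or jdbc/... inside the .jca/.composite/.xml files, so adjust the pattern to what your composite actually uses):
# Sketch: scan an exported composite archive (jar/zip) for data source references.
import re
import sys
import zipfile

# Assumed pattern: adapter connection factories / data sources referenced by JNDI name.
PATTERN = re.compile(r"(eis/DB/[\w./-]+|jdbc/[\w./-]+)")

def find_data_sources(archive_path):
    hits = set()
    with zipfile.ZipFile(archive_path) as archive:
        for name in archive.namelist():
            if name.endswith((".jca", ".xml", ".wsdl", ".composite")):
                text = archive.read(name).decode("utf-8", errors="ignore")
                hits.update(PATTERN.findall(text))
    return sorted(hits)

if __name__ == "__main__":
    for ref in find_data_sources(sys.argv[1]):
        print(ref)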
Mark the posting appropriately as "helpful" or "correct answer", if your issue is solved.
Regards
Albin I
[http://albinsblog.com/]

Similar Messages

  • Which data source of many to use, based on user entry of month

    Looking for hints relating to BEx queries inside the Web Application Designer's web templates.
    Currently a query uses one very, very large data source, such as payroll records.
    If the data were divided into smaller portions, such as monthly data sources, how can we:
    - establish a link between the variable entry screen where the user selects one specific month, and
    - force the query to return results from that one month's data source?
    In other words, we are looking for a way to build intelligence into the query so it can decide which data source to use based on the parameters the end user enters, in the hope that the process will return results faster because it has less data to weed through.
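    Just to illustrate the routing idea (a rough sketch in plain Python with made-up source names; not BEx-specific):
    # Illustrative only: pick the data source that holds just the requested month.
    MONTHLY_SOURCES = {
        "2007-01": "payroll_2007_01",   # made-up names standing in for monthly sources
        "2007-02": "payroll_2007_02",
    }

    def pick_source(month):
        """Return the source registered for the month the user entered."""
        if month not in MONTHLY_SOURCES:
            raise ValueError("No monthly source registered for " + month)
        return MONTHLY_SOURCES[month]

    def run_query(month, query):
        # The query now only has to weed through one month of data.
        return query(pick_source(month))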
    Thanks.

    Hi Sharon,
    As you said, the data source is a huge one. Let's say this data source feeds its data to only one cube, IC1. After the data is loaded into the cube, there will be a huge chunk of data from the source, so in order to improve reporting performance on this cube you have different options such as creating indexes, aggregates, etc. on the cube. In BEx, too, you have options to improve your query performance.
    As per your requirement, you want to display data for a month that the end user enters. For this you can have a variable entry for the time characteristic Calendar Month/Year; that way you can get the desired results for the query. You can improve performance further by having range (or single value) restrictions on Calendar Month/Year, if you are sure which calendar months or years the data will be queried for.
    Hope this serves your purpose.
    Regards,
    Syed.

  • Which data sources do I need to use for an Actual & Plan comparison of GL Account?

    Could you please let me know which DataSources I need to use for an Actual & Plan comparison of GL Account data?
    The current SAP BI version is BI 7.0.
    As per my knowledge, I am thinking of using the DataSources 0CO_OM_CCA_1 for Plan data and 0FI_GL_10 for Actuals.
    Is this right?
    I have one more question here:
    While extracting the data from DataSource 0CO_OM_CCA_1, I am getting only the Profit & Loss Account data, whereas the Balance Sheet Account data is not coming.
    Also, there is some confusion about VTYPE, because the standard DataSource 0CO_OM_CCA_1 is giving the data with Value Type 10 and Version 000.
    As we all know, VTYPE '10' stands for Actuals, so how can we say that this DataSource 0CO_OM_CCA_1 gives Plan data?
    Please clarify.

    Hi,
    Basically, the 0CO_OM_CCA_1 data source is used to extract actuals, plan, and commitments.
    The data is differentiated by value type; see SAP Note 523742 for more details on value types.
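    To make the value-type point concrete, here is a small, purely illustrative sketch (plain Python, not an extractor; the only mapping taken from this thread is value type 10 = Actuals, the other codes are placeholders to be confirmed against SAP Note 523742):
    # Illustrative only: group extracted records by their value type.
    VALUE_TYPE_LABELS = {
        "10": "Actuals",          # stated in this thread
        # "20": "Plan",           # placeholder -- confirm against SAP Note 523742
        # "40": "Commitments",    # placeholder -- confirm against SAP Note 523742
    }

    def split_by_value_type(rows):
        """Group rows (dicts with a VTYPE field) into actual/plan/commitment buckets."""
        buckets = {}
        for row in rows:
            label = VALUE_TYPE_LABELS.get(row["VTYPE"], "VTYPE " + row["VTYPE"])
            buckets.setdefault(label, []).append(row)
        return buckets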
    For balance sheet accounting data, check the link below:
    [http://help.sap.com/saphelp_nw04s/helpdata/en/5d/ac7d8082a27b438b026495593821b1/content.htm]
    Regards,
    Durgesh.

  • Problem setting up data source to use ODBC DSN

    SQL Server 2008 R2, BI Tools 2008 R2.
    When creating a new data source based on a 64-bit ODBC DSN, the DSN is not displayed in the 'Shared Data Source Properties / Edit / Connection Properties / Use user or system data source name' drop-down; the only DSNs displayed are 32-bit ones, not the 64-bit ones I need.
    How can I access the 64-bit DSNs I need?

    Hi,
    Based on your description, I am afraid your issue is outside the scope of the VS General Question forum, which mainly discusses the usage of the Visual Studio IDE such as the WPF & SL designer, the Visual Studio Guidance Automation Toolkit, Developer Documentation and Help System, and the Visual Studio Editor.
    In order to help you resolve your issue, could you please provide us with more information so we can check whether another forum can support your issue?
    What do you want to do?
    What are you doing?
    Which tool are you working with? You only mentioned 'SQL Server 2008 R2 BI Tools 2008 R2'; are you working in SQL Server or Visual Studio? Did you use BI Tools within Visual Studio?
    Could you provide us with an image of what you are encountering?
    Thanks,
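    For what it's worth on the original question: the split is usually down to process bitness. BIDS/BI Tools 2008 is a 32-bit application, so its DSN drop-down only shows DSNs created in the 32-bit ODBC Administrator. A quick, purely illustrative way to see which DSNs a process of a given bitness can reach (plain Python, assuming the pyodbc package is installed):
    # Lists the ODBC DSNs visible to this process.  A 32-bit interpreter sees
    # only the DSNs from the 32-bit ODBC Administrator, and likewise for 64-bit.
    import platform
    import pyodbc  # assumed to be installed; not part of the original thread

    print("Python bitness:", platform.architecture()[0])
    for dsn, driver in sorted(pyodbc.dataSources().items()):
        print(f"{dsn}: {driver}")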

  • Can we set the dynamic data source when using getReportParameters() ?

    Hello!
    I have a report where one of its parameters refers to a list of values (LOVs). This list of values is an SQL Query type. When the data source used in the report is defined in the BI Publisher server, I'm able to get the report parameters using the getReportParameters() function in my application. However, if the data source is not defined the function throws an exception, which is understandable.
    I decided to dynamically set the data source so that even if the data source used by the report is not defined in the BI Publisher server, it still will be able to get the LOVs for the parameter. I tried setting the JDBCDataSource of the dynamicDataSource for the ReportRequest object that I passed to the getReportParameters() function. Please see the sample code below:
    reportRequest.dynamicDataSource = new BIP10.BIPDataSource();
    reportRequest.dynamicDataSource.JDBCDataSource = new BIP10.JDBCDataSource();
    setReportDataSource(reportRequest.dynamicDataSource.JDBCDataSource, connectstr, jdbc, dc); //function to set the values for JDBCDataSource object
    reportParams = webrs.getReportParameters(reportRequest, uid, pwd); //call the getReportParameters
    I was expecting this to work, as this is what I did to dynamically set the data source before calling the runReport function. So, my question is: can we set the dynamic data source when using getReportParameters()? I tried this in both 10g and 11g; it does not seem to work in either version.
    Regards,
    Stephanie

    The report_id column of apex_application_page_ir_rpt can help us uniquely identify each saved report.
    We can assign this report_id value to a page item, and this page item can be put in the Report ID Item text box of the Advanced section of the Report Attributes page of the IR.
    This should load the saved report identified by report_id, and you can get rid of the JavaScript.
    Regards,
    Vishal
    http://obiee-oracledb.blogspot.com
    http://www.packtpub.com/oracle-apex-4-2-reporting/book
    Kindly mark the reply as helpful/correct if it solves your problem

  • How to find which data elements are used in an ABAP program?

    I wish to find out which data elements are used in some programs.
    Is there any DD table which stores this information?
    I know we can go to the program and look at the fields, but does SAP store this information somewhere?

    Dear Kaushal,
    Please check the table TRDIR.
    Regards,
    Abir
    Don't forget to award points.

  • Create Data Source Module using OMB

    I am not seeing how to create Data Source modules using OMB. I am assuming there is a property of OMBCREATE ORACLE_MODULE that I am overlooking. If anyone can provide assistance, it would be greatly appreciated.
    Thanks,
    craig

    Hi,
    Yes, there is an OMBCREATE command for creating Oracle modules. As per the Scripting Reference Guide (page 4-157, OWB 9i scripting guide):
    OMBCREATE ORACLE_MODULE 'src_module' SET PROPERTIES (DESCRIPTION, BUSINESS_NAME) VALUES ('this is an Oracle module', 'source module')
    This will create an Oracle module named "src_module"; its description is "this is an Oracle module" and its business name is "source module".
    HTH
    Mahesh

  • Posting data source name using url

    Hi!
    Is it possible to send a different data source for a report than the default one when I call the report using a URL link?

    Hi
    Not currently supported - it's on the list of to-dos though - no date as yet.
    Regards
    Tim

  • Check which data records have been updated or created by the business rule.

    Dear all,
       Is there any way to check which data records have been updated or created by the business rule?
    For example:
    When I run a data package which executes a standard stored procedure, the log shows:
    Executing SPRUNCONVERSION  'LEGALAPP', [ACTUAL], [GGGG],'GLOBAL','SPSCOPE_42412','SPLOG_125272'
    SPRunConversion Version 2.06
    20071200 - 210 Rows Calulated
    20071200 - 84 Rows Updated
    Is there any way to see the data content of the "84 Rows Updated"?
    Best Regards,
    Jeff

    Yes, each application has 3 fact tables:
    FACTWB: real-time storage. When you send data it is stored here, and this is linked to the FACTWB partition in OLAP, which is a ROLAP partition to enable real-time updates of the data.
    FAC2: short-term storage. If too many records are in FACTWB, the application becomes slow, because these records need to be queried by the OLAP engine every time new records are sent to the database. To clean up the real-time storage you can do a lite optimize, which will move the FACTWB records to the FAC2 partition. This lite optimize can be done even while users are in the system.
    FACT: long-term storage. When your reporting is finished you can run a full optimize to clean up both the FACTWB and FAC2 tables and store everything in the FACT table. This contains all your historical records.
    Every time you run a report, SAP BPC looks at all 3 partitions and combines their data to present the correct value in your report. Remember that BPC posts differences in the WB, not the final values. So if you have, for example, a record with 10 in the FAC2 partition and you send 12 to the database for that intersection, BPC will post only 2. When running a report it combines the 10 and the 2, which makes 12.
    Hope this makes it clearer.
    Joost
    Edited by: Joost Hoppenbrouwers on May 6, 2008 12:19 PM: Seems that James and I were writing pretty much the same post at the same time
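    A minimal sketch of the behaviour Joost describes (plain Python, purely illustrative; not BPC code): writes post only the difference into the real-time partition, and reads combine all three partitions.
    # Purely illustrative model of the three partitions and difference posting.
    partitions = {"FACTWB": {}, "FAC2": {}, "FACT": {}}

    def read(intersection):
        """A report combines the values stored in all three partitions."""
        return sum(p.get(intersection, 0) for p in partitions.values())

    def send(intersection, new_value):
        """Sending data posts only the difference into the real-time (FACTWB) partition."""
        delta = new_value - read(intersection)
        if delta:
            partitions["FACTWB"][intersection] = partitions["FACTWB"].get(intersection, 0) + delta

    # Example from the post: 10 already sits in FAC2, the user sends 12 -> only 2 is posted.
    partitions["FAC2"]["some_intersection"] = 10
    send("some_intersection", 12)
    print(partitions["FACTWB"]["some_intersection"])  # 2
    print(read("some_intersection"))                  # 12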

  • Sign in with username and password every time I want to check emails on my PC using iCloud?

    Signing in with username and password every time I want to check emails on my PC using iCloud
    Having convinced my parents to get rid of their old AOL account and sign up for iCloud to sync their new iPhone with their old PC running Windows 7, I thought I was on to a winner. All working OK, but is there any other way to get into emails besides going to the iCloud website and signing in with name and password every time? Can it be set up as a Mail account directly, like on a Mac? Or using Outlook or Entourage? Help! I'm not a PC person!!
    ELF

    If they have Win 7 or 8, and Outlook 2007 or newer, then you can.
    Just download the iCloud Control Panel for Windows, install it, and select the services you want (Mail, Contacts and Calendars will appear in Outlook).

  • Data Source Enhancements Using BADIs

    Hi gurus,
    Can you please let me know the process in detail to implement data source enhancements using BADIs?
    Please provide me the step-by-step procedure.
    An appropriate answer will be highly rewarded.
    Thanks,
    Shashikanth.

    http://www.sap-img.com/abap/difference-between-badi-and-user-exits.htm
    I have a few links for you! Just download the documents and have a read; you will get an idea:
    http://www.esnips.com/doc/e06e4171-29df-462f-b857-54fac19a9d8e/ppt-on-badis.ppt
    http://www.esnips.com/doc/43a58f51-5d92-4213-913a-de05e9faac0d/Business-Addin.doc
    http://www.esnips.com/doc/10016c34-55a7-4b13-8f5f-bf720422d265/BADIs.pdf
    http://www.esnips.com/doc/1e10392e-64d8-4181-b2a5-5f04d8f87839/badi.doc
    /people/thomas.weiss/blog/2006/04/18/how-to-implement-a-badi-and-how-to-use-a-filter--part-4-of-the-series-on-the-new-enhancement-framework
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/4493c4a6-0301-0010-62a2-efd6ded61e04
    Reward points if helpful
    Regards
    Edited by: Aadil on Jan 23, 2008 8:42 AM

  • Data Connection Library used in the workbook is not in a trusted location.

    I got a requirement to display Excel charts with a SharePoint list as the data source.
    Here are the steps which I am following:
    Exported SharePoint list to excel sheet and exported connection file to local drive.
    Uploaded to data connection library.
    Created a new excel file and consumed data from connection file which was stored in data connection library.
    Published the excel sheet to document library with publish options as Chart and Pivot Table.
    I am able to see the Excel chart and pivot table from the browser.
    I am getting an error when I try to refresh the Excel sheet to get updated data from the SharePoint list.
    I added the data connection library details to Trusted Data Connections & Trusted File Locations, but I still get the error that the .odc file is not in a trusted location:
    "The Data Connection File used in the workbook is not in a trusted location. The following connections failed to refresh:"
    Please help me with this and suggest whether there is any other approach to fulfil this requirement.

    Hi Pratik,
    Thanks for your input.
    I created a new document library and data connection library and tried the steps mentioned below:
    Configuration
    Enable Claims to Windows Token Service
    The first step is to check the status of the Claims to Windows Token Service on SharePoint.
    Use the following instructions to check and enable the Claims to Windows Token Service.
    In Central Administration, in System Settings, click Manage services on server.
    Select Claims to Windows Token Service, and then click Start.
    Verify the service is also running in the Services console:
    In Administrative Tools, click Services.
    Start the Claims to Windows Token Service if it is not running.
    Create a New Secure Store Application
    Create a New Secure Store Target Application and set the credential.
    This credential will be used by SharePoint to access the database. The account set in this step needs to have access to the database or the SSAS cube/tabular databases.
    Use the following instructions to create and set credential.
    In Central Administration, in Application Management, click Manage service applications.
    Click Secure Store Services, and then click New.
    Target Application Settings:
    Target Application ID: ExcelServicesSSS
    Display Name: Excel Service Secure Store App
    Contact e-mail: [email protected]
    Target Application Type: Individual
    Target Application Page URL: None
    Click Next.
    Click Next again.
    Target Application Administrators: contoso\administrator (type the user account that will administer the Secure Store Application).
    Click OK.
    Select the Target Application ID ExcelServicesSSS, and then click Set Credential.
    Type the Credential Owner, Windows User Name, and Password.
    Configure Excel Services
    Add the Application Id created in the previous step to the Excel Services and configure the Trusted Data Connection Library.
    Use the following instructions to add the Application ID.
    In Central Administration, in Application Management, click Manage service applications.
    Click Excel Services Application, and then click Global Settings.
    Under External Data, for Application ID, type ExcelServicesSSS.
    Click OK.
    Click Trusted Data Connection Libraries.
    Click Add Trusted Data Connection Library.
    Address: http://<sharepoint_site>/ (enter the Data Connection Address)
    Click OK.
    But no luck in resolving the issue. Any help would be really appreciated.
    Please clarify whether we can pull data from a SharePoint list into Excel Services to display charts.
    Regards,
    N.Srinivas

  • How can I change (or select) what is used as the file cover when I share? When I share, let's say, "Master File", I want to select a frame to be used as the file cover, so when it's on my desktop I see that image as the file icon.

    How can I change (or select) what is used as the file cover when I share? When I share, let's say, "Master File", I want to select a frame to be used as the file cover, so when it's on my desktop I see that image as the file icon.

    I don't think Finder does this (I've tried).
    iTunes does, though, where you can set artwork or the "poster frame".
    This may not be what you want, but if it helps, I know 2 ways to do this:
    Open the video in QuickTime Player 7 | View | Set Poster Frame (even then, you might need to save it as .mov, i.e. in a 'mov container').
    Drag the file into iTunes and set the artwork (as in http://www.dummies.com/how-to/content/adding-album-cover-art-or-images-in-itunes.html).
    From there, iTunes will use that frame as the "poster frame", i.e. the photo/frame that shows when you browse your videos. Which is what you want, but limited to iTunes.
    When I do either of these, the frame I set does not show when exploring files in Finder (or in the other explorer tool I use, called Pathfinder).
    So it may be that exactly what you want is not possible.

  • What type of Data target is used when the safety interval is used?

    Hi ,
    What type of data target is used when the safety interval is used in a generic delta in a DataSource? A DSO or a cube?
    Please explain.
    Thanks,
    Nimai

    It's up to your reporting requirement whether to go for a DSO or a cube.
    I prefer to have staging with a DSO and then further reporting on a cube.
    It's always easier to manage when you have any changes to the cube: you need not extract the data from R/3 again, because if you have a DSO you can load directly from it into the cube.
    Another thing is that if for any reason some of your records get corrupted, you can run full loads to the DSO, which has the overwrite functionality, and send delta records to the cube. Even in the cube you can do selective deletion, but you have to be very careful to avoid double records in the next loads.
    Edited by: Srinivas on Jul 29, 2010 8:23 PM

  • Target data source does not support the AGO operation

    Hi,
    In BI Admin Tool, I join Essbase cube and relational source. Then I apply Ago function to Essbase measures. In BI Answer, I try to run query that includes Essbase Ago measures and relational columns(non measures), error message shows the following detail:
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 22001] Target data source does not support the AGO operation. (HY000)
    When I remove the relational columns or run Essbase current-date measures, the result is fine.
    So, what is the exact meaning of this error message? And do the relational (non-measure) columns support the Essbase measures' AGO function?

    To clarify:
    Fail case:
    criteria: YEAR | YTD,gen03 | MONTH_NAME | SALES(YEAR_AGO)
    cube dimensions: year, ytd,gen03
    relational source: month_name
    cube measure using AGO(): sales(year_ago)
    result: error message
    Success case:
    criteria: YEAR | YTD,gen03 | SALES(YEAR_AGO)
    cube dimensions: year, ytd,gen03
    cube measure using AGO(): sales(year_ago)
    result: success! How can I solve it? Thanks
