Source data for pipeline report

Greetings,
I want to create a pipeline report that shows:
- Number of Leads
- Number of Leads converted into Opportunities
- Number of Opportunities
- Number of Won Opportunities
I used the funnel design for leads, but it lacks a filter for how many opportunities have been won.
Would anyone have a suggestion?
Thanks,
Julio Zarnitz

Hi Julio,
Since a single report doesn't cover all your required fields, you can use a combined data source and map the desired key figures and characteristics to get the required results
(e.g. you can try a combination of the Lead funnel and the Opportunity funnel, as both have all your required fields).
Refer to the thread below for detailed steps:
http://scn.sap.com/docs/DOC-63151
Regards,
Surjeet Bhati

Similar Messages

  • [APEX 3] Requested source data of the report has been modified

    Hello APEX-Friends,
    I have a common problem, but the situation is a bit different here. Many of you might know the "invalid set of rows requested, the source data of the report has been modified" problem. Often it occurs on submit: you have a report, you select rows, you do things, you submit the page and everything blows up.
    This is because you enter some values into fields the report depends on, so you modify your report parameters and the source data changes.
    But:
    In my case I have a dynamically created report that blows up before any submit occurs or any value changes.
    My query is a union of two selects. Both query different views. Those views use a date field as a parameter and some compare functions.
    I read the field with a V-function I wrapped around the APEX V function, declared as deterministic. My date compare function is also declared deterministic (I doubt this makes any difference, as it is probably only important for the optimizer, but as long as I don't know exactly what APEX evaluates, I play it safe).
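    Roughly, the query is shaped like the sketch below (view, column, and function names are placeholders, not the real ones; my_v stands for the deterministic wrapper around APEX's V function and my_date_eq for the deterministic date compare function):
        -- Sketch only: placeholder object names, not the actual views/functions.
        SELECT id, description, amount
          FROM view_one
         WHERE my_date_eq(report_date, my_v('P1_REPORT_DATE')) = 1
        UNION
        SELECT id, description, amount
          FROM view_two
         WHERE my_date_eq(report_date, my_v('P1_REPORT_DATE')) = 1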
    I ensured that the date field is set by default to the current date (and that works, because my interactive report initially displays correct data for the current date).
    So everything is deterministic and the query must return the same results on subsequent calls, but APEX still throws this "source data has changed" error, and I am 99.99% sure that this cannot be true.
    And now the awesome thing about this:
    If I change the value of the date field, a piece of JavaScript performs a submit. The page is reloaded (without resetting pagination!) and everything works fine. I can leave the page, re-enter, do things - everything works well.
    But if I log into the application, go directly to the corrupted report and try to use the pagination without editing fields or submitting the page, the error occurs.
    Do you have any idea what's happening there? I could work around it by submitting the page the first time it is entered, to trigger this "mystery submit" that gets everything working, but I would like to understand the issue and have a clean solution.
    Thanks in advance,
    Mike aka UniversE

    Okay, I found a solution, but I do not understand it - it might be a design flaw in APEX.
    I mentioned the date field that is used in the query. I also mentioned that it is set with the current date by default. I did not mention how.
    There are some possibilities in APEX to do so.
    1. Default-Setting in the element properties
    2. Static assignment if no value is in session cache
    3. Computation before header
    I did the first and second.
    BUT:
    An interactive report seems to work as follows: a query is executed to get all rows of the report, then a second query is executed to get the rows that are to be displayed. And the order is screwed up, I think:
    1. The first report query to get all rows
    2. The elements are loaded and set to default values
    3. The second report query to get the display rows
    And that's the reason why nothing worked. Since I added a computation before header, the date field is set before the report queries are executed, and everything works fine now.
    But I think it's a design flaw. Either both queries should be executed before the regions are rendered, or both afterwards, but not split, as item values might change when the elements are loaded.
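    For reference, the fix amounts to nothing more than a Before Header computation on the date item; assuming a hypothetical item name P1_REPORT_DATE and a computation type that returns a single value, something like this is enough:
        -- Hypothetical item name and format mask: a "Before Header" computation
        -- on P1_REPORT_DATE, so the item is already filled when both report
        -- queries run.
        SELECT TO_CHAR(SYSDATE, 'DD.MM.YYYY') FROM dual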
    Greetings,
    UniversE

  • There is no source data for this data record, Message FZ205

    Hi Experts,
    I am facing a problem with the DME file download. This problem started all of a sudden in our production system last month; it never happened before. Our system landscape has not changed either, but according to our Basis consultant, two or three new application servers have been added to the production client. We do not have this problem in our test clients.
    Please note that we have been using output medium '1' from day one, so the system generates the DME in the 'File System', which we download to the desktop and upload to the bank online. After running the payment run, when we try to download the DME file, the system gives the error "There is no source data for this data record, Message FZ205".
    I have tried to fix this issue in many ways but have not been able to. Can you please let me know the reason for this error and how to fix it?
    With best regards,
    BABA

    Hi Shailesh,
    Please share how you solved this problem.
    Many Thanks,
    Lakshmi

  • No data for daily report sent from my PC

    Since a clean install of Windows 7 Ultimate on my PC, no data is being sent to the Cisco server from my PC. Previously I was running the RC version of Windows 7 Ultimate on this PC, with the same router and configuration, and had no problems accessing a full daily report. I was using Outpost Security Suite Pro, which allowed the data to be sent, but am now using the native Windows firewall, configured to allow NM outward communications.
    I asked Support to assist with configuring the Windows firewall, but they were unable to, as they did not know how to do this - I would have thought that, as NM is Windows 7-compatible, this information would have been available to them.
    My netbook is running Windows 7 Ultimate RC with Outpost Security Suite Pro and does send the data for the daily report when using my home wireless network and the same router, so I suspect that the Windows 7 firewall is causing the problem on the PC.
    Can anyone advise me on the specifics of configuring the Windows 7 firewall to allow NM to send the necessary data so that I can get the daily report?
    Running Network Magic version 5.5.9195.0-Pure. ADSL-2 connection; Netcomm NB6Plus4W router; wired PC connection to the router; PCI adaptor card; VIPRE Antivirus and Antispyware (disabling this made no difference).

    Hi ambergris77,
    First, only one firewall/security product should be running within the operating system at a time, otherwise you will have problems. Also, Network Magic needs full access, not just outward access.
    thecreator - Running Network Magic version 5.5.9195.0-Pure0 on Windows XP Home Edition SP3.
    Running Network Magic version 5.5.9195.0-Pure0 on a wireless computer with McAfee Personal Firewall Build 11.5.131. The wireless computer has a D-Link DWA-552 connecting to a D-Link DIR-655 A3 router.

  • Using SQL queries via ODBC connection to obtain data for a report

    Post Author: adhiann
    CA Forum: General
    Hi All,
    I was trying to run a Crystal Reports report through ClearQuest, using a SQL query I designed in CQ to give me the data I need. However, from within ClearQuest I cannot associate a SQL query with a report, so I went directly to Crystal Reports and am using the ODBC connection to the ClearQuest database to run the report. I selected the right tables and fields and am using the same formula that I used in ClearQuest's SQL query to get the data; however, the report doesn't return any data, whereas I get at least 5 records from ClearQuest for the same query.
    Granted, there's a difference in the way you create a SQL formula in Crystal, but I don't know if I'm doing it right, as I've never had to use CR as a standalone product.
    Is there a way I can plug my ClearQuest SQL query directly into CR and run it? I don't know why the report won't return any data for a query that has results.
    Thanks in advance
    nandita

    Post Author: Roscoe1822
    CA Forum: General
    Did you try adding a Command through the Database Expert? Creating a Business View that contains your SQL query might help as well.
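    For what it's worth, a Command in the Database Expert is just a free-form SQL statement that Crystal sends to the ODBC source as-is, so in principle the ClearQuest query can be pasted straight in. Purely as an illustration of the shape (the table and field names here are made up, not the real ClearQuest schema):
        -- Hypothetical table and field names, only to show what a Command looks like.
        SELECT cr.id, cr.headline, cr.state, cr.owner
          FROM change_request cr
         WHERE cr.state = 'Opened'
         ORDER BY cr.id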

  • How to Integrate SAP EP with SAP PI to send data for creating reports in BW

    Hi Gurus
    I am stuck with an object/interface creation in my project. My requirement is to integrate SAP Enterprise Portal and SAP BW using SAP PI. I am working on SAP PI 7.1. The SAP EP sits on an Oracle database and will send some data required for generating reports in BW.
    My question is how to integrate the above systems, which adapters to use, and which approach is best. If we use PI, the major advantage is that we can monitor the flow of the messages.
    Can't the BW system talk directly to the SAP EP system? If there is such an approach, can we monitor the message flow without PI involvement?
    All the above-mentioned systems (SAP EP, SAP PI, SAP BW, SAP SRM, SAP ABAP) are in the client landscape.
    Please suggest an approach.
    Thanking you

    Hi rajshekar,
    Thanks for the reply.
    In the current project we have an interface which is a bypass scenario from MDM to BI: MDM provides the data, PI picks it up with a sender File communication channel and posts it to BW using a receiver File communication channel. This does not involve any development of ESR objects.
    My current requirement seems to be much the same: pick up the data from the SAP Portal and give it to BW for generating reports.
    Can I use SOAP on the sender side and File on the receiver side without the involvement of ESR objects?
    Does the bypass approach work for this particular interface?
    How can we integrate SAP EP with BW without PI?
    Please share any blogs or forums.
    Please help.
    Thank you
    Edited by: VArjun86 on Jan 4, 2012 10:27 AM

  • Breakout of data for Webi Report

    I am not sure if this question should go here or in the Universe forum but I will post it here first.  I am using BOBJ Edge XI3.1 with SQL Server 2005 holding my data.
    I have a universe that has 4 tables in it. The first table is "CSS_Facts", which is a fact table consisting of aggregate data. Table two is "TJG_Organization_CSS", which holds six levels of hierarchy for all clients: Corporation, System, Facility, Division, Service Line and Department. Department is the lowest level, so there is an "Organization_Key" for every department for every client. Table three is "TJG_Survey_CSS", which holds all of the survey data for every client; every answer for every survey has its own "Survey_Key". Lastly I have "TJG_Time", which has a "Time_Key" for every day of the year.
    Table "CSS_Facts" has a foreign key for each of the three tables above with the same field name as the key field in each of those tables.  The cardinality for all three of these joins is many to 1 from the "CSS_Facts" table (ex:  many "Survey_Keys" from the Facts table to one key in "TJG_Survey_CSS").  The facts table also has 4 other fields which hold the counts and or computed fields.
    All of this works perfectly for most of what we do.  I can build a WEBI document that has all of the fields that I need.  As an example, I can get a report of all the statements for a given timeframe for a specific survey number for a specific client.  If the survey has 50 statements in it I can get back all of the statements with the counts for all of the possible answers for each statement.  This is expected and what we want.
    Our problem is being able to break this down based upon the answer(s) to one question in the survey. Let's say statement number 2 in the survey is an age statement that has 5 possible answers: answer 1 might be "Less than 18.", answer 2 might be "19 to 29.", and so on. When I filter my query in Webi to bring back results only for respondents who chose answer 1 (Less than 18) for this statement, I get back pretty much what I expect: that question only, with all of the possible answers, but a computed total just for answer 1 (Less than 18) and zeroes for all of the other answers. Again, this is what I expect based on the WHERE clause that is generated, but not really what we want. We want to get back the complete survey in the report - all 50 statements - with new count totals per statement based on the answer (Less than 18) to that one question.
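    In plain SQL terms, what we are after looks roughly like the sketch below. Note that it assumes the facts can be tied back to an individual survey response (the response_key, statement_no, answer_text and answer_count columns are only illustrative and are not part of our actual model), because filtering on the answer row alone will always zero out the other statements:
        -- Illustrative only: hypothetical column names on top of the star schema above.
        -- Keep all 50 statements, but only for responses where statement 2 = 'Less than 18.'
        SELECT s.statement_no,
               s.answer_text,
               SUM(f.answer_count) AS answer_count
          FROM CSS_Facts f
          JOIN TJG_Survey_CSS s ON s.Survey_Key = f.Survey_Key
         WHERE f.response_key IN (
                 SELECT f2.response_key
                   FROM CSS_Facts f2
                   JOIN TJG_Survey_CSS s2 ON s2.Survey_Key = f2.Survey_Key
                  WHERE s2.statement_no = 2
                    AND s2.answer_text = 'Less than 18.'
               )
         GROUP BY s.statement_no, s.answer_text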
    Is this possible to do within Webi reports? If not, can this be done outside of a Webi report and then brought back into the Universe to be consumed by a Webi report?
    Thanks for any help in advance.

    You can use the following workaround without the need for an SDK solution.
    Display this info in the report instead of in the pop-up:
    just create a cell with the info you want to show and hide it if there is data
    (use an alerter to empty the cell when you have data and hide it when it is empty);
    show it when no data is fetched.
    Hope this helps,
    Marianne

  • How to get data for Planning reports in 'CRM Interactive BI Reports'?

    Hi ,
    Can you please let me know how we can get the data for the planning reports?
    Do we have any specific data in the BI system related to the planning reports?
    Planning Reports are :
    1. Plan/Actual Comparison
    2. Plan/Actual Monitor
    3. Plan/Actual Analysis
    4. Plan/Actual Compar. (YTD/YTG)
    5. Sales Volume Forecast
    6. Plan/Actual Net Revenue/Cust.
    7. Plan/Actual Net Revenue/Prod.
    Thanks
    Ravi

    We should have a separate BI system to get the data for all these reports.
    Regards
    Ravi

  • Data source Infocubes for standard reports

    Hello Everybody,
    Can you please assist: how can I work out which DataSource feeds which InfoCube, and which InfoCube feeds which standard reports?
    I am planning to activate standard reports in various modules. How do I determine the corresponding DataSource and InfoCube relevant to these reports?
    Thanks for your help.
    TS

    Hi,
    If you know the standard query name, you can get the InfoProvider name on the "Information" tab (BEx Analyzer). Using that InfoProvider, go to Business Content, choose the grouping "In Data Flow Before", and install the InfoProvider and all the relevant objects. Then go to RSA1, find the InfoProvider, and right-click on it > Display data flow.
    Hope this helps.
    Thanks and regards

  • Use JDBC to query data for JSP Report

    Hi all,
    I ran into trouble when using JDBC to query data:
    it shows the data in Reports Builder, but I get an error when calling it from a URL, for example: http://localhost:8889/reports/TestJDBCReport.jsp
    I get the following error message:
    javax.servlet.jsp.JspException: rwlib-1: REP-4100: Failed to execute data source.
    REP-0069: Internal error
    JDBCPDS-62000: Invalid sign-on parameter P_JDBCPDS
    If anyone knows, please help me.
    Many thanks

    As a general rule, it's a good idea to separate the presentation (JSP and HTML) from the business rules (database access). I know you didn't do that on the AS/400 - you had display files and business logic in the same program (at least, we certainly do in ours) - but it's a good policy to follow in the web world. That means: don't put your database access code in the JSP. Other than that, it depends on the data. If you have simple data (e.g. a customer's name and address) then a Java bean would suffice. If you have complex data (e.g. a customer's payment history) then a bean still might suffice. You would use an "include" if you had some data (static or dynamic) that you wanted to appear in several different pages in the same form.
    Thanks, I figured putting the code in the JSP was not the best way, but I wasn't sure about the other options.

  • Extract BPC 5.0 data for BI reporting

    Hi,
    I want to know: in order to extract data from BPC 5.0 (Microsoft version) for reporting in BI tools, do I need to connect to the underlying SQL tables, or can I leverage the Analysis Services cubes as an OLAP source?
    If it is the SQL tables, how is the application database structured in terms of dimensions and facts? How easy is it to figure them out and build relationships between the tables in the BI modeling layer?
    Appreciate you taking time to answer.

    Within BPC's standard functionality delivered in ApShell, there are two packages:
    "Export" - exports from the OLAP cube (aggregate members, choose your measure, etc.)
    "Export from fact tables" - exports... well... from the fact tables!
    Both of these generate a text file and allow you some flexibility to set up dimension mappings and some other useful features.
    You should certainly use one or both of those if at all possible, since they involve no custom development. The Data Manager guide gives fairly complete documentation of the options available.
    If you need a more elegant / sophisticated / integrated export routine, you can go to the underlying SQL tables, to the OLAP cube, or to some combination of both, using DTS programming for automation.
    In either approach, the SQL fact tables will be faster than the OLAP cube. I've never tried a customized OLAP export, but I have done a few custom SQL exports, which give complete flexibility to map data against conversions using member properties, aggregate data from BPC base members to the export system's base members, etc., using standard SQL in a stored procedure. (You could also do these using DTS, but I find it easier to stage the export using SQL; just my personal preference.)
    I'm not sure, in an OLAP export, how you'd access that kind of metadata information.
    The key tables are
    tblFactAppName
    tblFac2AppName
    tblFactWBAppName
    and then
    mbrAccount
    mbrTime
    mbrDimension
    etc. for each dimension
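    As a rough illustration of what a custom SQL export can look like ("Finance" stands in for your application name, and the column names follow the usual BPC 5.x pattern of one column per dimension plus SIGNEDDATA - verify everything against your own database before using it):
        -- Sketch only: placeholder application name and column names; check your own schema.
        SELECT f.[Account], a.ACCTYPE, f.[TimeID], SUM(f.SIGNEDDATA) AS Amount
          FROM (
                SELECT * FROM tblFactFinance
                UNION ALL SELECT * FROM tblFac2Finance
                UNION ALL SELECT * FROM tblFactWBFinance
               ) f
          JOIN mbrAccount a ON a.ID = f.[Account]
         GROUP BY f.[Account], a.ACCTYPE, f.[TimeID]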

  • Creation of summarization data for PA report

    Dear experts,
    I am facing the following problem with PA. I am trying to create a report using KE30 which utilizes summarization data. I then use KE3Q to run it in the background for only one period. Despite whatever I do, after some hours the background job is cancelled. In ST22 I get the following error: EXPORT_COMPRESS_ERROR.
    Although I checked on the above, I couldn't find anything in combination with PA...
    Any ideas?
    Best Regards
    Peter

    Hi Peter,
    Can you try executing the report in the foreground from KE30?
    It is better to have sales org, product, customer, fiscal year, period, etc. on the selection screen.
    If it executes in the foreground for one customer or product, it means there is no issue in the report. If it does not execute for one customer, then you have to check the CO-PA form and report.
    BR,
    ADI

  • Source data for Legal and Management Consolidation

    Hi,
    I'm on ECC 5.0, using BCS 4.0 and BW 3.5.
    Our current design requires two types of consolidation: company consolidation and profit centre consolidation. Note that the profit centre consolidation also requires balance sheet and profit/loss data.
    Now, I know that basically the source of the data coming from R/3 is the special ledger table FAGLFLEXT. Both company and profit centre data share this same table in order to maintain data consistency.
    My question is:
    1. Is my understanding about FAGLFLEXT correct?
    2. What are the prerequisites steps so that the table FAGLFLEXT can have the profit centre data inside?
    Any advice, please?
    regards,
    Halim

    Hi Halim,
    Yes, you are right.
    As a prerequisite, you need to activate new General Ledger Accounting in Customizing for Financial Accounting in the OLTP system:
    http://help.sap.com/saphelp_nw04/helpdata/en/be/928f40f5767d17e10000000a1550b0/frameset.htm
    http://help.sap.com/saphelp_erp2005/helpdata/en/b6/5f58405e21ef6fe10000000a1550b0/frameset.htm
    See here an example of configuration:
    http://help.sap.com/bp_bblibrary/500/documentation/N70_BB_ConfigGuide_EN_DE.doc
    here a presentation on GL in mySAP ERP:
    http://www.asug.com/client_files/DocUploads/search/doc1194.ppt
    and here a thread about dataflow from R/3 to BCS:
    http://eai.ittoolbox.com/groups/technical-functional/sap-r3-sem/dataflow-from-r3-to-sem-bcs-950671
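    Once new G/L is active (with the profit center update scenario assigned to your ledger) and documents have been posted, a quick way to confirm that FAGLFLEXT really carries profit centre data is a totals query along these lines (field names as commonly found in FAGLFLEXT - please verify them in SE11 first):
        -- Sanity-check sketch: verify the field names (RYEAR, RBUKRS, PRCTR, RACCT, HSL01, ...) in SE11.
        SELECT RYEAR, RBUKRS, PRCTR, RACCT, SUM(HSL01) AS PERIOD_01
          FROM FAGLFLEXT
         WHERE RYEAR = '2011'
         GROUP BY RYEAR, RBUKRS, PRCTR, RACCT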
    Best regards,
    Eugene

  • Source data for Goods Receipt ledger transactions

    Hi,
    I've been tasked with writing a custom report similar to transaction KSB1 (Display actual cost line items for cost centers), but with the vendor number and name added.
    I can write the ABAP but need some help identifying the source tables.
    The fields on the report below look like they come from the accounts payable ledger, but I'm not sure which table that is. Any help greatly appreciated.
    Cost Center
    Cost Element
    Period
    Cost Element Name
    Document Type
    Document No.
    User
    Purchase Order Text
    Purchasing Document
    Document Header Text
    Value in Report Currency

    Hi,
    You will find this information in the MSEG table: item details for material documents (header information is in MKPF).
    I think this is a better strategy than using the G/L line items for the vendor.
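    For example, a join along these lines pulls the goods receipt line items together with the vendor number and name (field names as commonly found in MKPF/MSEG/LFA1; the cost center and date values are placeholders - verify everything in SE11):
        -- Sketch only: placeholder selection values; check the field names in SE11.
        SELECT h.MBLNR, h.MJAHR, h.BUDAT,
               i.ZEILE, i.MATNR, i.EBELN, i.KOSTL,
               i.LIFNR, v.NAME1, i.DMBTR
          FROM MKPF h
          JOIN MSEG i ON i.MBLNR = h.MBLNR AND i.MJAHR = h.MJAHR
          LEFT JOIN LFA1 v ON v.LIFNR = i.LIFNR
         WHERE i.KOSTL = '0000001000'
           AND h.BUDAT BETWEEN '20120101' AND '20120131'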
    Good luck,
    Paul

  • 0HR_PT_2 How to get back historical data for new report time type

    Hi All Expert,
    We implemented the 0HR_PT_2 extractor and have been using it for the past year. The delta is working. Recently there was a requirement to read more data from the ZL cluster table, and new BW report time types were added to extract this data.
    The delta doesn't pick up the past year's data for the new report time types that we added.
    Do we need to re-initialize the load to get the historical data back every time we add a new report time type?
    Please advise. Thanks,
    Ken
    Edited by: Ken Hong on Feb 27, 2008 9:24 PM
    Edited by: Ken Hong on Feb 27, 2008 9:25 PM

    P.S. All hidden files are shown in ES File Explorer, as this backup folder was originally hidden (it has a '.' in front). So I'm pretty sure the folder is gone, but since I have not erased my phone again, shouldn't the folder still be somewhere on my SD card, and how can I find it using my Mac?
