Multi data sources with a mismatch of fields

Post Author: [email protected]
CA Forum: General
Hi,
I'm writing a Crystal report (using Crystal XI), pulling data from two different databases.  The first issue is how to display these concurrently - I'm guessing the best option is to build them as two subreports?
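(A sketch, not from the original thread: one common way to combine the two sources is a subreport on the second database, placed in the main report and linked on the stock/product number, with shared variables carrying the subreport's figures back into the main report. In Crystal syntax that takes two small formulas; the tpl_pics field below is taken from the sample data further down, so adjust it to whatever you actually summarise.)

// Formula placed inside the subreport (e.g. suppressed in its report footer):
// store the subreport's total in a shared variable.
Shared NumberVar subUsage := Sum({tpl_pics.Current_month_usage})

// Separate formula placed in the main report, in a section that prints after the subreport:
// read back the value the subreport stored.
Shared NumberVar subUsage;
subUsage

The main-report formula has to sit in a section that prints after the one holding the subreport, otherwise the shared variable is still empty when it is evaluated.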
The report I'm hoping to produce will give usage summaries for stock held in our warehouse, for the current month and then the preceding 12 months, in the following order:
Stock No, this month's usage, last month's usage, etc., along with some additional information about the stock item.
The first database is nice and easy: in the table I'm pulling the data from, I have the product code for the stock items followed by the last 36 months of summarised history, so I can put it into a report as follows:
Stock no     Current month's usage     Previous month's usage     Previous month - 1 usage     Previous month - 2 usage
Not the best format, but it would do... but is there any way of Crystal working out from the current date what the current month is, what the previous month is, and so on, and changing the displayed column names to the actual months? So, for example, if I ran the report today, it would display March instead of "current month".
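(A hedged sketch, not from the original thread: the column headings can be formula fields derived from the run date. MonthName and DateAdd are standard Crystal-syntax functions, so a heading formula like the one below works, with only the offset changing per column.)

// Column-heading formula for the "previous month" column (Crystal syntax).
// An offset of 0 gives the current month, -2 the month before last, and so on;
// DateAdd is used so the formula still works across a year boundary.
MonthName(Month(DateAdd("m", -1, CurrentDate)))

With an offset of 0 the same formula returns the current month's name ("March" in the example above) instead of a static "Current month" caption.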
The next problem is that the second database holds this usage in two possible tables.  In one, every transaction is shown, so I could use a filter to select only usage and then group the data into months - but how do I then display this across the report so that it matches up with the first database?
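(Again only a sketch, with placeholder table and field names since the real ones aren't given: one conditional formula per month column, summed on the stock-number group, collapses the transaction-level data into the same month-by-month columns as the first database.)

// "Previous month" usage from a transaction-level table (Crystal syntax).
// Assumes {transactions.trans_date} is a date field and {transactions.issue_qty} the issued quantity.
// DateSerial accepts out-of-range month numbers, so Month(CurrentDate) - 1 still works in January.
If {transactions.trans_date} >= DateSerial(Year(CurrentDate), Month(CurrentDate) - 1, 1)
   and {transactions.trans_date} < DateSerial(Year(CurrentDate), Month(CurrentDate), 1)
Then {transactions.issue_qty}
Else 0

Drop one such formula per column into the details section, change the month offsets for each, and sum them on the stock-number group so the rows line up with the first database's columns.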
I thought of using a cross-tab report, but I need to pull in other information as well as just the usage, and I can't see any way to do that...
The second option is to use a table that summarises the usage. The problem there is, again, that the usage isn't in the same order as in the first report.
In this table I have Stock No, This Year Period 1 Usage, This Year Period 2 Usage, etc., then Last Year Period 1 Usage, etc.
Can I get Crystal to dynamically choose which fields go in there, so I only show the current month's usage with the last 12 months' usage, so that it's lined up with the first report?
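(One possible approach, sketched with made-up field names and assuming the periods map one-to-one onto calendar months: a Select Case formula keyed on the run date picks the matching period column for each displayed month.)

// "Current month" usage chosen dynamically from the period columns (Crystal syntax).
// Build one formula per displayed column, stepping the month number back each time
// and switching to the Last Year Period fields once the offset crosses January.
Select Month(CurrentDate)
    Case 1  : {summary.This_Year_Period_1_Usage}
    Case 2  : {summary.This_Year_Period_2_Usage}
    Case 3  : {summary.This_Year_Period_3_Usage}
    Case 4  : {summary.This_Year_Period_4_Usage}
    Case 5  : {summary.This_Year_Period_5_Usage}
    Case 6  : {summary.This_Year_Period_6_Usage}
    Case 7  : {summary.This_Year_Period_7_Usage}
    Case 8  : {summary.This_Year_Period_8_Usage}
    Case 9  : {summary.This_Year_Period_9_Usage}
    Case 10 : {summary.This_Year_Period_10_Usage}
    Case 11 : {summary.This_Year_Period_11_Usage}
    Default : {summary.This_Year_Period_12_Usage}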
Hope that made some kind of sense... thank you in advance to anyone who is able to help.

Post Author: [email protected]
CA Forum: General
OK...
Here are some examples of the data:
From the first database:
(all fields from mrpstkstatus)
stock_no       tyr_issues_p1   tyr_issues_p2   tyr_issues_p3   tyr_issues_p4   tyr_issues_p5
9908315/001    3,200.00        0.00            0.00            0.00            0.00
2376           1,720.00        0.00            0.00            0.00            0.00
7701112/002    3.00            503.00          0.00            0.00            0.00
8801460/004    3,960.00        4,729.00        0.00            0.00            0.00
8801519/003    32,488.00       488.00          0.00            0.00            0.00
9901275/008    87,120.00       27,705.00       0.00            0.00            0.00
9902128/011    6,752.00        39,606.00       0.00            0.00            0.00
9902524/002    1,200.00        0.00            0.00            0.00            0.00
9902524COM     1,200.00        0.00            0.00            0.00            0.00
9903869/005    35,240.00       27,160.00       0.00            0.00            0.00
9904392/003    20,000.00       0.00            0.00            0.00            0.00
9904475/002    451.00          327.00          0.00            0.00            0.00
And the second:
(all fields from tpl_pics)
Product_code   Current_month_usage   Previous_month_usage   Previous_month_usage__1   Previous_month_usage__3
106324         0                     0                      0                         0
95065          0                     0                      0                         0
182054         0                     0                      0                         0
120728         0                     0                      0                         0
85224          0                     28                     49                        24
152143         0                     0                      0                         0
181967         0                     0                      0                         0
152149         0                     0                      0                         0
182123         0                     0                      0                         0
Ideally, I need to get them running concurrently, and looking something more like:
stock_no       January   February   March   April
9908315/001    3200      0          0       0
2376           1720      0          0       0
7701112/002    3         503        0       0
8801460/004    3960      4729       0       0
8801519/003    32488     488        0       0
9901275/008    87120     27705      0       0
9902128/011    6752      39606      0       0
9902524/002    1200      0          0       0
9902524COM     1200      0          0       0
9903869/005    35240     27160      0       0
9904392/003    20000     0          0       0
9904475/002    451       327        0       0
106324         0         0          0       0
95065          0         0          0       0
182054         0         0          0       0
120728         0         0          0       0
85224          0         28         49      24
152143         0         0          0       0
181967         0         0          0       0
152149         0         0          0       0
182123         0         0          0       0
Obviously, I've cheated a little, and just copied and pasted, so the data no longer actually matches up - but hopefully you get the idea...
Thanks again in advance...
Nick

Similar Messages

  • Generic Data Source with Function Module data mismatch

    Hi All,
    I'm using a generic DataSource with a function module. When I execute the function module (which I created myself), I get 16,000 records, but when I run the extractor in RSA3 I get a different number of records (in fact, more of them).
    When I run the InfoPackage in BI, I also get more records than I got from executing the function module, and some single records are split into two records on the BI side (not all of them). How is that possible?
    Is there anything I'm missing in explaining my issue? If it's understood, please help me out.
    Thanks and regards,
    Ravi.

    Hi rkiranbi,
    1. First execute the function module with your parameters; you will get some records. Then go to transaction RSA3, enter your DataSource name and execute. Under Settings there are options such as Data Records/Calls and Display Extractor Calls, plus the selection fields; increase those values, and pass the same parameters in RSA3 that you use for the function module selections in SE37. You should now get the same number of records from the function module and from RSA3. If you don't, you need to check the coding logic in the function module.
    2. If you are getting wrong values in the BI system, then:
        1. Compare the PSA data with the data-target data (check the characteristics as well as the key figures). If you find a mistake, change the coding in the function module according to the client requirement.
        2. Compare the RSA3 data with the BI report data or data-target data.
    Work through the steps above carefully and you will get to a solution.
    Thanks and regards,
    Malli

  • Generic Data Source with Function Module data mismatch in BI

    Hi All,
    I'm using a generic DataSource with a function module. When I execute the function module (which I created myself), I get 16,000 records, but when I run the extractor in RSA3 I get a different number of records (in fact, more of them).
    When I run the InfoPackage in BI, I also get more records than I got from executing the function module, and some single records are split into two records on the BI side (not all of them). How is that possible?
    Is there anything I'm missing in explaining my issue? If it's understood, please help me out.
    Thanks and regards,
    Ravi.

    The DataSource framework calls the function module several times:
    1. the initialization call,
    2. then several further calls, until you raise NO_MORE_DATA.
    Check your coding: have you refreshed the necessary internal tables between calls?
    Sven

  • Not able to create a Data Source with a qty field in the table

    Hi, I have the following scenario:
    I am doing a generic extraction from an R/3 system. I created a view from two tables and also added a net quantity field to the view. But when I create a DataSource for that view, I get an error message saying that a DataSource cannot be created with a quantity field in the table.
    Here is my question: how do I create a DataSource for a view that has a quantity field in it?
    Please help with this.
    Senthil

    Hi Senthil,
    You need to add a reference table for the unit when you create the view. You can find the reference table name in the table the quantity field is brought into the view from, under its currency/quantity fields.
    Add that reference field and table in the currency/quantity fields of the view, then activate it.
    I think that works.
    Regards,
    Vamsi

  • OIM 9.1.0.2 - Weblogic JDBC Multi Data Sources for Oracle RAC

    Does OIM 9.1.0.2 BP07 support WebLogic JDBC Multi Data Sources (Services > JDBC > Multi Data Sources) for Oracle RAC, instead of entering the Oracle RAC JDBC URL on the JDBC Data Sources for xlDS and xlXADS (Services > JDBC > Data Sources > xlDS|xlXADS > Connection Pool > URL)?
    If yes, are there any other modifications that need to be made in OIM, or do we just change the data sources?

    Yes, it's supported. You install directly against one instance of the RAC server, then you update the config.xml file and the JDBC resource in your WebLogic server with the full RAC address. It is documented for installation against RAC: http://docs.oracle.com/cd/E14049_01/doc.9101/e14047/database.htm#insertedID2
    -Kevin

  • Multi data source and global transactions

    Hi,
    I am working on JDBC multi data source connectivity, trying to resolve a "lock in doubt held" error.
    Searching through Oracle sites, I found a recommendation along these lines:
    "Disable server-side load balancing for each of the pools. This can be done by setting the INSTANCE_NAME attribute in your JDBC connect descriptor aliases. For example:
    jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=tcp)(HOST=MYDBHOST1)(PORT=1522)))(CONNECT_DATA=(SERVICE_NAME=MYDB)(INSTANCE_NAME=MYDB1)))
    If you miss the INSTANCE_NAME attribute in your JDBC connect descriptor, the Oracle TNS listener could still redirect you to some other instance depending on the load on the instance in question."
    Now I am unable to understand which data sources should be updated with this URL - the multi data source or the target data sources?
    Can somebody give me a clue?

    Every datasource in a MultiDataSource must have a URL that ensures that every connection
    in a given DataSource is always, only to one specific and unchanging RAC node.

  • Multi Data Source, Fail Over, Read-Only detected as unavailable

    We are using WLS 10.3.6.0 with WebCenter Content (WCC) as the application.
    For the production back end, RAC will be be the production DB and we need to support failover to Data Guard.
    We set up a Multi Data Source (MDS) with the first simple Data Source pointing to RAC and the second Data Source pointing to Data Guard.
    Upon testing, this scenario basically works correctly: when RAC is shut down and Data Guard enabled, the switch occurs and the app works against Data Guard. Conversely, when Data Guard is shut down and RAC brought back up, the MDS reverts to pointing at RAC and the app works correctly.
    However, the DBA insists that the alternate Data Source should not need to be completely down: it should be possible for it to be in a read-only state, and the MDS should treat a read-only instance as unavailable and try the alternate Data Source. This does not work with the default setup; I see in the logs that a connection can be made to the read-only DB instance, which will not work at all with the app and is not what the customer wants.
    Is there a way to configure the MDS so that if a Data Source is only available read-only, the alternate Data Source will be used?

    I may be able to describe a trick... Try setting the "test table" property of the Data Guard DataSource to an update statement, ideally one against an otherwise unused table, e.g.:
    SQL update myJunkTable set foo = 1 where 1 = 0
    This won't actually update the table (which could even have zero rows), but I would hope the read-only DBMS throws an exception anyway. That gives WLS the impression it can't get healthy connections, and that DataSource won't be used until the read-only status is removed.
    However, from a standards point of view, a read-only DBMS may be perfectly useful for some applications, so that is not in itself a reason for WLS not to use it.

  • Generic Data source with Infoset Query

    Hi All,
    Please provide links, PPTs, information or PDFs on creating a generic DataSource with an InfoSet query, or explain in clear steps how to create a generic DataSource with an InfoSet query.
    Thanks & regards,
    Ravi

    Oops... being a BI guy, I always think of BW.
    I have never created one myself, but I suggest the following:
    A query can be created to extract information from master records, i.e. infotypes. For example, by creating a query, the data relating to an employee contained in various infotypes can be extracted.
    Procedure:
    Decide on the infotypes you want to query and on the area you want to query in, i.e. global area or standard area. The standard area is client-specific; the global area includes all clients.
    Menu: HR -> PM -> Admin -> Information System -> Ad Hoc Query
    Select the standard area and select the user group already created.
    Creation of a new query:
    Transaction SQ03 -> select Environment -> select Standard Area -> Enter. If a new user group is to be created, enter the name of the user group, click on Create, enter the necessary information, save and exit.
    Transaction SQ02 -> enter the name of the InfoSet -> Create -> enter the name of the InfoSet -> Data source -> table join by basis table -> give the name of a table, e.g. PA0000 -> Enter -> click on Insert Table if you want to include more tables, giving the table names one by one. When finished, place the cursor on the joining lines and right-click to delete unwanted relationships -> check -> go back -> field groups -> include all table fields -> click on the Generate button -> exit.
    Transaction SQ03 -> select the user group, e.g. Payroll.
    InfoSet -> enter the name of the newly created InfoSet.
    Assign users and InfoSets -> assign InfoSets -> tick Payroll -> save and go back.
    Transaction PAAH -> expand the nodes and tick the relevant fields as required.
    Save the query with the same name as the InfoSet for ease of use.
    Thanks...
    Shambhu

  • Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of '

    When I deploy the cube which is sitting on my PC (local) the following 4 errors come up:
    Error 1 The datasource , 'AdventureWorksDW', contains an ImpersonationMode that that is not supported for processing operations.  0 0 
    Error 2 Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW', Name of 'AdventureWorksDW'.  0 0 
    Error 3 Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed.  0 0 
    Error 4 Errors in the OLAP storage engine: An error occurred while the 'Customer Alternate Key' attribute of the 'Customer' dimension from the 'Analysis Services Tutorial' database was being processed.  0 0 

    Sorry, I hit the wrong button there. That is not the entire solution: setting it to default works when using a single box, but not in a distributed application solution. If you are creating the Analysis Services database manually or using the wizard, then you can set the impersonation however you like, as long as the right permissions have been set on the Analysis Services server.
    In my case I was using MS Project Server 2010 to create the database in the OLAP configuration section. The situation is that the underlying build script is configured to use the default setting, which is the SQL service account, and this account does not have permission in Project Server, I believe.
    Changing the account to match the Project service account allowed a successful build/creation of the database. My verdict is that this is a bug in Project Server, because it needs to include the option to choose impersonation when creating the database; that way it will not use the default, which led to my error in the first place. I do not think there is a single fix that suits everyone for this problem; it is an environment-by-environment issue and should be resolved as such. But the idea behind fixing it is: if you are using the SQL Analysis Services service account as the account creating the database and cubes, then the default or service account is fine. If you are using a custom account, then set that custom account in the impersonation details after you have granted it the SQL Analysis Services administrator role. You can remove that role after the DB is created and harden it by creating a role with administrative permissions.
    Hope this helps.

  • An error occurred querying a data source - with REST services

    Hi,
    I have a SharePoint 2013 form library with an InfoPath form. I need to get the logged-in user's display name automatically on form load.
    I used a REST service to fetch the current user's details. In the form's preview mode it shows the right name, but when I publish the form to the library I get the following error.
    REST Service --> http://site url/_api/SP.UserProfiles.PeopleManager/GetMyProperties
    Please help me to resolve this issue.
    Thanks in advance for your time and reply :)

    Hi,
    According to your post, my understanding is that an error occurred while querying a data source with REST services.
    It is definitely a permission issue with the GetUserProfileByName service, and there could be many reasons for this problem. First try with a UDCX file, and make sure that the User Profile Service is running.
    Here are some similar threads for your reference:
    http://social.technet.microsoft.com/Forums/en-US/b8c668ea-7511-4657-a1a8-08fb4a6bd53d/info-path-an-error-occurred-querying-a-data-source?forum=sharepointcustomizationprevious
    http://social.technet.microsoft.com/Forums/en-US/46866ac2-da09-4340-a86a-af72cbb2c8d7/info-path-an-error-occurred-querying-a-data-source-?forum=sharepointcustomization
    http://blogs.msdn.com/b/russmax/archive/2012/08/17/want-to-call-sharepoint-2010-web-services-within-browser-based-infopath-2010-forms.aspx
    Best Regards,
    Linda Li
    TechNet Community Support

  • Shared Data Source with prompted credentials - errors out.

    Using Data Tools (VS 2010) to build report.
    Have built a simple report while trying to diagnose this issue.  This report is created in a project that was originally built in BIDS2008 and migrated to Data Tools 2010.  This project contains an original shared dataset that does NOT
    error out.
    Report contains
         1) shared data source with properties set to prompted credentials.
         2) shared dataset using data source above
         3) 2 parameters (month and year)
         4) Report uses shared dataset
         5) 2 text boxes in the report body to display the parameters values.
    When I try to preview this report I get the error:
         The execution failed for the shared data set "xxx"
         Cannot create a connection to data source 'Data source for shared dataset'
         Security processing failed with reason "3" ("password missing")
    Now, if I change that same shared dataset to use a static login and password, the report renders fine.
    What is going on???

    Hi RoseNeedsAVacation,
    When you use a shared data source with the "Prompt for credentials" option, please use a SQL Server login credential to view the report in the BIDS or SSDT report designer environment.
    Note: the shared data source automatically switches to SQL Server authentication mode if we use the "Prompt for credentials" option.
    After you deploy your report and shared data source to the report server, please remember to configure the shared data source to use the "Credentials supplied by the user running the report" option. Furthermore, select the "Use as Windows credentials when connecting to the data source" checkbox if the credentials the user provides are Windows Authentication credentials. Do not select this checkbox if you are using database authentication (for example, SQL Server Authentication).
    For more information, please refer to the article below:
    New Data Source Page (Report Manager):
    http://technet.microsoft.com/en-us/library/ms180077(v=sql.100).aspx
    Regards,
    Elvis Long
    TechNet Community Support

  • Using JDBC Data Sources with ADFBC, NoInitialContextException

    Using JDBC Data Sources with ADF Business Components, NoInitialContextException
    I followed the instructions in the link below to create an ADF Swing application using a data source. I am using JDeveloper version 10.1.3.
    http://www.oracle.com/technology/products/jdev/howtos/10g/usingdatasources/using_datasources.html
    The ADF generated code looks like this:
    JUMetaObjectManager.setErrorHandler(new JUErrorHandlerDlg());
    JUMetaObjectManager mgr = JUMetaObjectManager.getJUMom();
    mgr.setJClientDefFactory(null);
    BindingContext ctx = new BindingContext();
    ctx.put(DataControlFactory.APP_PARAM_ENV_INFO, new JUEnvInfoProvider());
    ctx.setLocaleContext(new DefLocaleContext(null));
    HashMap map = new HashMap(4);
    map.put(DataControlFactory.APP_PARAMS_BINDING_CONTEXT, ctx);
    mgr.loadCpx("datasource.view.DataBindings.cpx" , map);
    final FormMain frame = new FormMain();
    frame.setBindingContext(ctx);
    I got this error when executing the last line: frame.setBindingContext(ctx);
    (oracle.jbo.common.ampool.ApplicationPoolException) JBO-30003: The application pool (datasource.datamodel.AppModuleDS) failed to checkout an application module due to the following exception:
    ----- LEVEL 1: DETAIL 0 -----
    (oracle.jbo.JboException) JBO-29000: Unexpected exception caught: oracle.jbo.DMLException, msg=JBO-27200: JNDI failure. Unable to lookup Data Source at context jdbc/xe_hrDS
    ----- LEVEL 2: DETAIL 0 -----
    (oracle.jbo.DMLException) JBO-27200: JNDI failure. Unable to lookup Data Source at context jdbc/xe_hrDS
    ----- LEVEL 3: DETAIL 0 -----
    (javax.naming.NoInitialContextException) Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    If I configure the application module connection type as JDBC URL, everything works.
    If the connection type is JDBC Datasource, I got the above error.
    Can someone show me how to adjust the generated code by ADF to use datasource?

    ADF BC has a bug: with a data source configured in the application module, the application module does not connect. Use a JDBC connection URL instead.
    Also refer to: ADF BC: JDBC URL vs JDBC DataSource

  • Generic data source with float field possible?

    Hello,
    When creating a generic DataSource using a view with a float field, I get error R8359 (extract structure: you tried to generate an extract structure with the template structure ....; this operation failed because the template structure's quantity or currency fields, for example field ..., refer to a different table).
    I changed the data element from ATFLV to e.g. FLOAT, but it did not help.
    SAP Note 335342 deals with this issue, but I just want to use the float number without the unit.
    Is this possible or do I need to write a function module?
    Best regards
    Thomas

    Hi,
    you could try to add the unit table and field to your view. When saving the datasource in RSO2, you can choose to hide these fields if you don't want them extracted into BW.
    Regards,
    Øystein

  • Generic Data Source with View

    Hi Experts.....
    I previously created a view based on VBRP and VBRK (the common field is VBELN), but something has confused me for a long time: these two tables already share the same standard DataSource, 2LIS_13_VDHDR, so why create a view? Please explain with a real-time scenario.

    Hi,
    When we don't find a suitable standard extractor, we can go for a generic one (for example, if I want sales information along with finance information in one DataSource, there is generally no standard one, so we go for a generic DS).
    Re: Extraction, Flat File, Generic Data Source, Delta Initialization
    E.g.: suppose you want information about all customers across regions, and the data is split like this:
    1) Table 1 has all the information about the customer number, but not the customer address, region and pin code;
    2) Table 2 has the customer number, address, region and pin code.
    Since the two tables have the customer number as a common field, creating a view gives you all the information in a single view; you can then create a generic DataSource based on it and extract the data to BW.
    Regards,
    Satya

  • Getting 401 error while creating a Report Data Source with MOSS 2010 Foundation

    I have set up SQL Server 2008 R2 Reporting Services with SharePoint 2010 Foundation in SharePoint integrated mode. SharePoint Foundation is on machine 1, whereas SQL Server 2008 R2 and the SSRS report server are on machine 2. While configuring the Reporting Services - SharePoint integration, I used "Windows Authentication" as the authentication mode (I need to use Kerberos).
    My objective is to setup a Data Connection Library, a Report Model Library, and a Reports Library so that I can upload a Report Data Source, some SMDLs, and a few Reports onto their respective libraries.
    While creating the top level site, "Business Intelligence Center" was not available for template selection since SharePoint Foundation is being used. I therefore selected "Blank Site" as the template.
    While creating a SharePoint Site under the top level site, for template selection I again had to select "Blank Site".
    I then proceeded to create a library for the data connection. Towards this, I created a new document library and selected "Basic page" as the document template. I then went to Library Settings for this newly created library and clicked on
    Advanced Settings. In the Advanced Settings page, for "Allow management of content types?" I selected "Yes". Then I clicked on "Add from existing content types" and selected "Report Data Source". I deleted the existing
    "Document" content type for this library.
    Now I wanted to create a Data Connection in the above Data Connection library. For this, when I clicked on "New Document" under "Documents" of "Library Tools" and selected "Report Data Source", I got the error "The request failed with HTTP status 401: Unauthorized.".
    Can anybody tell me why I am getting this error?
    Note: I have created the site and the library using SharePoint Admin account.

    Hi,
    Thank you for your detailed description. According to the description, I noticed that the report server is not part of the SharePoint farm. Add the report server to the SharePoint farm and see how it works.
    To join a report server to a SharePoint farm, the report server must be installed on a computer that has an instance of a SharePoint product or technology. You can install the report server before or after installing the SharePoint product or technology instance.
    For more information, see:
    http://msdn.microsoft.com/en-us/library/bb283190.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support
