Multiple data sets: a common global dataset and per-report data sets

Is there a way to have a common dataset included in an actual report data set?
Case:
For one project I have about 70 different letters, each letter being a report in BI Publisher, each with its own dataset(s).
However, all of these letters share a common, standardized reference block (e.g. the user, his email address, his phone number, etc.); this reference block comes from a common dataset.
The layout of the reference block is done by including a sub-layout (RTF file).
The SQL query that builds the dataset for the reference block is always the same and, for now, is included in each of the 70 reports.
This makes maintenance of the reference block very hard, because each of the 70 reports must be adapted whenever the reference block/dataset changes.
Is there a better way to handle this? Can I include a shared dataset, defined and maintained only once, in each individual report definition?

Hi,
The use of a subtemplate for the centrally managed layout is fine.
However, I would like to be able to do the same thing for the datasets in the reports:
one centrally managed dataset (definition) for the common data, which is dynamic (!) and, in our case, a rather complex query,
and
datasets defined on a per-report basis.
It would be nice if we could do a kind of 'include dataset from another report' when defining the datasets for a report.
Of course, this included dataset would be executed within each individual report.
That would make maintaining this one central query much easier than maintaining it separately in each of the 70 reports.

Similar Messages

  • Dynamically generating the ssrs dataset and filling the data into the dataset and binding it to ssrs report dynamically

    I have a task to do. In SSRS we are using server reports in our project. I am looking at dynamically generating the SSRS dataset, filling the data into that dataset, and binding the dataset to the SSRS report (RDL) dynamically.
    Getting the dataset dynamically has a solution using the Report Definition Customization Extension (RDCE), but there was no solution for binding that dataset to the report (RDL) dynamically.
    Here is the reference for RDCE: http://www.codeproject.com/Articles/355461/Dynamically-Pointing-to-Shared-Data-Sources-on-SQL#6
    I looked into binding the dataset to the report (RDL) dynamically and searched many sites, but I did not find a solution. Can anyone help me here?
    Are there any custom assemblies or custom data processing extensions to work around this? Please help.
    Thanks in advance

    Hi Prabha2233,
    Thank you for your question.
    I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected due to the transfer. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    Regards,
    Vicky Liu
    TechNet Community Support

  • To find the date type fields in the row and validate those date fields

    TYPES: BEGIN OF ty_mara,
             matnr TYPE mara-matnr,
             ersda TYPE mara-ersda,
             ernam TYPE mara-ernam,
             laeda TYPE mara-laeda,
             mtart TYPE mara-mtart,
           END OF ty_mara.

    DATA: it_mara  TYPE STANDARD TABLE OF ty_mara,
          it_mara1 TYPE STANDARD TABLE OF ty_mara,
          wa_mara  TYPE ty_mara,
          c_data   TYPE c LENGTH 1.        "field type returned by DESCRIBE FIELD

    LOOP AT it_mara INTO wa_mara.
      DESCRIBE FIELD wa_mara-ersda TYPE c_data.
      IF c_data = 'D'.                     "only validate date-type fields
        CALL FUNCTION 'DATE_CHECK_PLAUSIBILITY'
          EXPORTING
            date                      = wa_mara-ersda
          EXCEPTIONS
            plausibility_check_failed = 1
            OTHERS                    = 2.
        IF sy-subrc <> 0.                  "not a valid date -> reset to initial value
          wa_mara-ersda = '00000000'.
        ENDIF.
        APPEND wa_mara TO it_mara1.
        WRITE: / wa_mara-matnr, wa_mara-ersda.
      ENDIF.
    ENDLOOP.
    This issue is about how to find the date-type fields in a row and validate those date fields. If a value is not a valid date, I have to assign the initial value to it.
    I've tried that for a single field using DESCRIBE FIELD. Please help me do it for all fields.
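
    A minimal sketch of one way to generalize this to every component of the work area, assuming the requirement is simply to reset any date-type field to its initial value when DATE_CHECK_PLAUSIBILITY fails (the names it_mara, it_mara1 and wa_mara are taken from the code above):

    DATA: lv_idx  TYPE i,
          lv_type TYPE c LENGTH 1.
    FIELD-SYMBOLS <lv_comp> TYPE any.

    LOOP AT it_mara INTO wa_mara.
      lv_idx = 0.
      DO.
        lv_idx = lv_idx + 1.
        ASSIGN COMPONENT lv_idx OF STRUCTURE wa_mara TO <lv_comp>.
        IF sy-subrc <> 0.
          EXIT.                          "no more components in the work area
        ENDIF.
        DESCRIBE FIELD <lv_comp> TYPE lv_type.
        CHECK lv_type = 'D'.             "only look at date-type components
        CALL FUNCTION 'DATE_CHECK_PLAUSIBILITY'
          EXPORTING
            date                      = <lv_comp>
          EXCEPTIONS
            plausibility_check_failed = 1
            OTHERS                    = 2.
        IF sy-subrc <> 0.
          <lv_comp> = '00000000'.        "invalid date -> reset to initial value
        ENDIF.
      ENDDO.
      APPEND wa_mara TO it_mara1.
    ENDLOOP.

    The same loop works for any flat structure, because ASSIGN COMPONENT walks the components by index instead of hard-coding field names.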

    Hi Sam,
    I believe we discussed the same issue in the thread below. Can you please refer to it?
    http://social.msdn.microsoft.com/Forums/sharepoint/en-US/d93e16ff-c123-4b36-b60b-60ccd34f6ca7/calculate-time-differences-in-infopath?forum=sharepointcustomizationprevious
    If it's not helping you please let us know
    Sekar - Our life is short, so help others to grow

  • UCCX 7.0.1SR5 to 8.0 upgrade while also adding LDAP integration for CUCM - what happens to agents and Historical Reporting data?

    Current State:
    •    I have a customer running CUCM 6.1 and UCCX 7.01SR5.  Currently their CUCM is *NOT* LDAP integrated and using local accounts only.  UCCX is AXL integrated to CUCM as usual and is pulling users from CUCM and using CUCM for login validation for CAD.
    •    The local user accounts in CUCM currently match the naming format in active directory (John Smith in CUCM is jsmith and John Smith is jsmith in AD)
    Goal:
    •    Upgrade software versions and migrate to new hardware for UCCX
    •    LDAP integrate the CUCM users
    Desired Future State and Proposed Upgrade Method
    Using the UCCX Pre Upgrade Tool (PUT), backup the current UCCX 7.01 server. 
    Then, during a weekend maintenance window:
    •    Upgrade the CUCM cluster from 6.1 to 8.0 in 2 step process
    •    Integrate the CUCM cluster to corporate active directory (LDAP) - sync the same users that were present before, associate with physical phones, select the same ACD/UCCX line under the users settings as before
    •    Then build UCCX 8.0 server on new hardware and stop at the initial setup stage
    •    Restore the data from the UCCX PUT tool
    •    Continue setup per documentation
    At this point does UCCX see these agents as the same as they were before?
    Is the historical reporting data the same with regards to agent John Smith (local CUCM user) from last week and agent John Smith (LDAP imported CUCM user) from this week ?
    I have the feeling that UCCX will see the agents as different almost as if there is a unique identifier that's used in addition to the simple user name.
    We can simplify this question along these lines
    Starting at the beginning with CUCM 6.1 (local users) and UCCX 7.01.  Let's say the customer decided to LDAP integrate the CUCM users and not upgrade any software. 
    If I follow the same steps with re-associating the users to devices and selecting the ACD/UCCX extension, what happens? 
    I would guess that UCCX would see all the users it knew about get deleted (making them inactive agents) and then see a whole group of new agents get created.
    What would historical reporting show in this case?  A set of old agents and a set of new agents treated differently?
    Has anyone run into this before?
    Is my goal possible while keeping the agent configuration and HR data as it was before?

    I was doing some more research looking at the DB schema for UCCX 8.
    Looking at the Resource table in UCCX, it looks like there is primary key that represents each user.
    My question, is this key replicated from CUCM or created locally when the user is imported into UCCX?
    How does UCCX determine if user account jsmith in CUCM, when it’s a local account, is different than user account jsmith in CUCM that is LDAP imported?
    Would it be possible (with TAC's help most likely) to edit this field back to the previous values so that AQM and historical reporting would think the user accounts are the same?
    Database table name: Resource
    The Unified CCX system creates a new record in the Resource table when the Unified CCX system retrieves agent information from the Unified CM.
    A Resource record contains information about the resource (agent). One such record exists for each active and inactive resource. When a resource is deleted, the old record is flagged as inactive; when a resource is updated, a new record is created and the old one is flagged as inactive.

  • How do I read from a text file that is longer than 65536 lines and write the data to an Excel spreadsheet and have the data write to a new column once the 65536 cells are filled in a column?

    I have data that is in basic generic text file format that needs to be converted into Excel spreadsheet format.  The data is much longer than 65536 lines, and in my code I haven't been able to figure out how to carry over the data into the next column.  Currently the conversion is done manually and generates an Excel file that has a total of 30-40 full columns of data.  Any suggestions would be greatly appreciated.
    Thanks,
    Darrick 

    No need to use nested For loops. No need for any loop anyway. You just have to use a reshape array function. The picture below shows how to proceed.
    However, there may be an issue if your number of elements is not a multiple of the number of columns: zero-value elements will be added at the end of the last column in the generated 2D array. How much that matters depends on the way you intend to store the data in the Excel spreadsheet: you could convert the data to strings, replace the trailing zero values with empty strings, and write the whole 2D array to a file (with the .xls extension) using the Write To Spreadsheet File function. Only one (minimal) problem: you have to define the number of decimal digits to be used;
    or you could write the numeric array directly to a true Excel spreadsheet, using either the NI Report Generation Toolkit or ActiveX commands, and then replace the last elements with empty strings.
    We need more input from you to decide how to solve these last questions. 
    Chilly Charly    (aka CC)
    Attachments:
    Example_VI.png (10 KB)

  • Single Standard data source which extracts material master and vendor master data

    Hi all,
    I have a client requirement where the user wants to view material master and vendor master data in a single report. Is there any standard DataSource which extracts both together?
    Thanks!
    Arvind

    Hi,
    These are two different master data InfoObjects, and you can't get them from one DataSource.
    Think about a MultiProvider or an InfoSet on the two master data objects to combine them.
    InfoObject 0MATERIAL - DataSource 0MATERIAL_ATTR
    InfoObject 0VENDOR - DataSource 0VENDOR_ATTR
    Thanks

  • Oracle Forms, Global Variables and Oracle Reports

    Hi all,
    We are developing a form that will call an Oracle report, and we would like to test, in the report's "Before Parameter Form" trigger, the value of a global variable that was set in Oracle Forms. Can this be done?
    Thanks, Larry

    The "global" var. is global to either a report or a form, not both. You may pass them from one to the other but that's not the session-global thing.
    DC

  • Cannot open file /data_work/shuttle/ab.dat . Please check its path and per

    Hi Experts,
    We are extracting data from R/3 and we have a problem with this error message:
    Cannot open file </data_work/shuttle/ab.dat>. Please check its path and permissions. This problem occurs every day, but with different jobs. One day a job runs fine, and sometimes it fails with this message. All permissions on the file are fine; it is so intermittent that we are unable to trace the source of the problem. Also, when you re-run the job after the failure, it runs fine.
    It is also hard to recreate the issue, but it happens pretty much every day during our production schedule.
    Does the job server fail to read it?
    We are using DS 12.2.0.
    Hopefully I've explained it properly.
    Appreciate your thoughts on this

    How are you transferring the file? Shared directory? What about trying the FTP server method?
    (You need an FTP server on your SAP machine: IIS for Windows, ftpd for Unix.)
    This way the FTP protocol would take care of the transport to your DS machine.
    An advantage is that with the FTP method you can put retries into the DSConfig.txt file.
    1571986 - Error: Data Services FTP transfer for ABAP integration is giving error "Data flow <data flow name> FTP could not transfer file" while running an ABAP - Data Services
    Section: AL_Engine, Key: FTPNumberOfRetry
    Section: AL_Engine, Key: FTPRetryInterval
    Norbert

  • Project report and frozen report data - CJEB

    I would like to build frozen report data based on different selection variants, because our data volume is too large to do it in a single run. The problem I now have is that the last job deletes the previously saved data. Currently we have created one report per selection variant, using the same form, but ideally we would like to use only one report with dedicated selection variants and extracts. Is there a possibility to save several results at the same time?
    thanks a lot
    Harald

    Saving several results at the same time will not be possible.

  • How to use global classes and display returned data?

    Hello experts,
    I have the following code in a program which accesses a global class (found in the class library). It executes one of its static methods. What I would like to do is get hold of some elements of the returned data. How do I do that, please?
    Your help is greatly appreciated.
    ***Use global class CL_ISU_CUSTOMER_CONTACT
    DATA: o_ref TYPE REF TO CL_ISU_CUSTOMER_CONTACT.
    DATA: dref_tab LIKE TABLE OF O_ref.
    DATA: begin OF o_ref2,
    CONTACTID               TYPE CT_CONTACT,
    P_INSTANCES             TYPE string,
    P_CONTEXT               TYPE CT_BPCCONF,
    P_CONTROL               TYPE ISU_OBJECT_CONTROL_DATA,
    P_DATA                  TYPE BCONTD,         "<<<=== THIS IS A STRUCTURE CONTAINING OTHER DATA ELEMENTS
    P_NOTICE                TYPE EENOT_NOTICE_AUTO,
    P_OBJECTS               TYPE BAPIBCONTACT_OBJECT_TAB,
    P_OBJECTS_WITH_ROLES    TYPE BAPIBCONTACT_OBJROLE_TAB,
    end of o_ref2.
    TRY.
        CALL METHOD CL_ISU_CUSTOMER_CONTACT=>SELECT  "<<<=== STATIC METHOD & PUBLIC VISIBILITY
          EXPORTING
            X_CONTACTID = '000001114875'   "Whatever value here
          RECEIVING
            Y_CONTACTLOG = o_ref.
      CATCH cx_root.
    ENDTRY.
    WHAT I WOULD LIKE TO DO IS TO MOVE o_ref TO o_ref2 and then display:
    1) P_DATA-PARTNER
    2) P_DATA-ALTPARTNER
    How can I do this please?

    I now have the following code, but when I check the syntax I get different errors. They are listed at the end.
    Here is the code the way it stands now:
    ================================================
    ***Use global class CL_ISU_CUSTOMER_CONTACT
    DATA: oref TYPE REF TO CL_ISU_CUSTOMER_CONTACT.
    DATA: dref_tab LIKE TABLE OF oref.
    DATA: begin OF oref2,
    CONTACTID TYPE CT_CONTACT,
    P_INSTANCES TYPE string,
    P_CONTEXT TYPE CT_BPCCONF,
    P_CONTROL TYPE ISU_OBJECT_CONTROL_DATA,
    P_DATA TYPE BCONTD,      "THIS IS A STRUCTURE CONTAINING OTHER DATA ELEMENTS
    P_NOTICE TYPE EENOT_NOTICE_AUTO,
    P_OBJECTS TYPE BAPIBCONTACT_OBJECT_TAB,
    P_OBJECTS_WITH_ROLES TYPE BAPIBCONTACT_OBJROLE_TAB,
    end of oref2.
    TRY.
    CALL METHOD CL_ISU_CUSTOMER_CONTACT=>SELECT     " STATIC METHODE & PUBLIC VISIBILITY
    EXPORTING
    X_CONTACTID = '000001114875' "Whatever value here
    RECEIVING
    Y_CONTACTLOG = oref
    ENDTRY.
    field-symbols: <FS1>      type any table,
                   <wa_oref2> type any.
    create data dref_tab type handle oref.   " <<===ERROR LINE
    assign dref->* to <FS1>.
    Loop at <FS1> assigning  <wa_oref2>.
    *use <wa_orfe2> to transfer into oref2.
    endloop.
    write: / 'hello'.
    =========================================
    Here are the errors I get:
    The field "DREF" is unknown, but there is a field with the similar name "OREF" . . . .
    When I replace it by OREF I get:
    "OREF" is not a data reference variable.
    I then try to change it to dref_tab. I get:
    "DREF_TAB" is not a data reference variable.
    Any idea? By the way, must there be a HANDLE event for this to work?
    Thanks for your help.
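
    If the attributes such as P_DATA are public instance attributes of CL_ISU_CUSTOMER_CONTACT (please verify the class in SE24, including which exception classes the SELECT method declares - both are assumptions here), a minimal sketch would be to read them straight through the returned object reference, with no CREATE DATA or field symbols at all:

    DATA lo_contact TYPE REF TO cl_isu_customer_contact.

    TRY.
        CALL METHOD cl_isu_customer_contact=>select
          EXPORTING
            x_contactid  = '000001114875'    "whatever value here
          RECEIVING
            y_contactlog = lo_contact.
      CATCH cx_root.                         "replace with the method's specific exception classes
        CLEAR lo_contact.
    ENDTRY.

    "Assumption: p_data (with fields partner and altpartner) is a public attribute.
    IF lo_contact IS BOUND.
      WRITE: / lo_contact->p_data-partner,
             / lo_contact->p_data-altpartner.
    ENDIF.

    If the attributes turn out to be private or protected, check whether the class offers getter methods for them instead.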

  • GRC AC Emergency Access Management (EAM) and STAD report data

    Dear Community,
    We use EAM (ID-based firefighting) and the log synchronization jobs are scheduled every half hour in order to get the firefighter logs from the back ends for review by the controller. Due to a technical issue the synchronization jobs did not work correctly over several days. We experienced missing session details (executed transactions, programs, changes, etc.) for many firefighter sessions. As one of the sources of the firefighter log is STAD on the back end, and these data are only buffered for 48 hours by default, I expect that I cannot recover the logs and that they are irreversibly lost if GRC is down or the sync jobs are not running for longer than that. That can happen over a weekend...
    I ask you:
    can you confirm my expectation?
    does it make sense to extend the STAD buffer to e.g. 96 hours for all GRC production back ends?
    do you have controls in place to check that the sync jobs are running and the logs are synchronized correctly and completely?
    I would appreciate, if you can share some thoughts with me about this.
    Thanks in advance,
    Andreas Langer

    Hi Andreas,
    - Please check the below note, for missed log entries
    1934127 - GRC10 EAM: EAM recovery program to retrieve missing log and to generate the missing workflows
    - The maximum value is 99; it is the number of stat files that are generated, so you can get records for up to 4 days.
    - A periodic monitoring activity can be set up, which is done manually. I am not aware whether Process Control can take care of this monitoring.
    regards

  • HT4436 iPad issue. Set up iCloud on new iPad, which works fine on my Mac and phone. However, when going to iCloud on that iPad, I am sent to the welcome/set up iCloud screen, and iCloud is already set up. How do I get the iPad to recognize the iCloud setup?

    iPad/iCloud setup issue. I have iCloud working fine on iPhone and Mac. New iPad. Set up iCloud through the usual utility. However, when signing in to iCloud I get sent to the initial "Set up iCloud" screen and cannot sign in to my iCloud account - on the iPad only. It works fine on any other device.
    Why?  Anyone know what I might have done wrong?
    Otis

    You can't go to icloud.com on an iOS device like your iPad.  You can only view your data on icloud.com from a computer.  This is because you can already view all your iCloud data on your iOS device from within your apps (contacts in the contacts app, calendars in the calendar app, email in the mail app, etc.).

  • Setting up a mail client and the outgoing SMTP set...

    If you switch to BT and you already have email accounts set up in a mail client such as Mac Mail, you will not find any help from BT to set up the correct outgoing SMTP mail server settings. Most other ISPs are usually very helpful with this, but BT have a policy that is designed to make it very difficult to use anything but a BT email account. They claim that it is "to protect our email server from abuse by spammers". However, their suggested solution has potentially quite the opposite effect, as you will not be able to enable SSL for it to work!
    OK so the first thing to do is create a new 'MY BT' account and then go to 'email' and create a new BT email account. Keep a note of address and password as you will need this later.
    Go back to the main BT home page and click on Mail. You should land on a page that has the new 'BT Mail Account' in the left column and, above it, a purple tool bar with 'Settings' at the end. Click on 'Settings'.
    In the left column, underneath 'Mail' click on 'Accounts'
    Click 'Add' and fill in the details of your 3rd party (non-BT) account. They encourage IMAP only, but I used POP with no problems as I have no intention of using the BT mail client.
    In 'Mail settings' fill out the mail server details from your email provider or wherever your site is actually hosted. For example, if you are using 000webhost.com then they will advise 'mx.000webhost.com', or maybe 'mail.yourwebsitename'.
    The correct port to enter is the port listed in your own email client. For me, with Mac Mail, this is listed in Preferences; if you check this, also double-check that 'Use SSL' is NOT ticked.
    Enter the email user name and password of your 3rd party account
    Make sure in 'Security' that 'Requires SSL' is NOT checked
    Then save the account
    Click on the 'Email' link in the purple tool bar and there now under the 'BT Mail Account' should be your 3rd party account. Click on it and it should now load your email. If it doesn't then you will have to try again the steps above as there is no point in going any further if the BT client cannot access your account.
    OK if it is working then go to your email client (Mac Mail in my case) and in preference set up your 3rd party account as follows.
    Incoming mail server is as the info from your host ( like as in 'mx.000webhost.com' above). Enter user name and password for the 3rd party account.
    Now set up an outgoing mail server (SMTP). I just called it BT, with the server name 'mail.btinternet.com'. In the advanced section, click 'Use default ports'. It actually uses 465 if you do not have a default option.
    Do NOT enable SSL
    Set Authentication to 'Password'
    Then in user name enter the details of the NEW BT account you opened for user name and password.
    And that's it. Well, it works for me! I got no assistance whatsoever from BT in trying to solve this. I asked to cancel my new BT Infinity service after only 5 days when it proved so difficult to get an email client like Mac Mail to work. I had 4 calls from them, and even the 'Tech Experts' had no idea what I was talking about!

    Melski wrote:
    Ah yes, thank you Oliver, that is a better solution. However, all the settings in the BT client have to remain as no SSL. If you change it to use SSL there, you will not be able to access the 3rd party email account.
    Your email client cannot handle SSL for SMTP (BT) and non-SSL for POP3 (000webhost)?
    Oliver.

  • XCelsius and Crystal Report data connectivity

    Hello,
    I have an Xcelsius SWF file and a Crystal Report in which I want to show the SWF. The data of the SWF should depend on a cross-table in the Crystal Report (event type/month). I tried, but have several difficulties:
    1. there are different counts of event types in the different customer databases, so I cannot assign a fixed data area in Xcelsius
    2. I have sorted the columns in the cross-table in CR according to the number of the month, but the name of the month is showing. When I choose the option "use an existing cross table", the SWF in the design form only shows the numbers and not the names as in the cross-table of CR
    3. in the preview no SWF is shown
    4. how do I bind cross-table(s) AND single values to CR?
    Can anybody help, or suggest exact documentation of the connection between the two programs, with an example?
    Thanks in advance for help.
    Monika Anna

    Hi Ingo,
    Yes, I understand what you're saying:
    "for distributing report objects with a larger audience you need a BusinessObjects server environment - either BusinessObjects Edge or BusinessObjects Enterprise."
    What I don't understand is WHY we need BusinessObjects Enterprise to distribute the Crystal Report so that my users can access it.
    The reason I am confused is that earlier on you mentioned that we (BW users) can call the URL and access the Crystal Report that is saved in the BW role.
    Since my BW users can access the Crystal Report directly from BW, WHY do we still need BusinessObjects Enterprise?
    Please advise, thank you.

  • How do I reset responses (and summary report data) and use my online form for a new application?

    I created my online form, distributed it and captured the responses - worked well.  Now I want to use the same form and start over (track the responses separately from the first time). How do I reset the results / responses? Thanks for any help you provide. 

    Go to the My Forms tab on the main dashboard (where all of your forms are listed). Select your form and hit the Duplicate button. This will make a copy of your form, and you have the option to include or not include the responses in the copy.
    Randy
