Receive Reporting Data in LMS

Hi.
I'm developing a small LMS that will mostly use Adobe Captivate 3 and Adobe Flash files.
I would like to know what the output of the E-mail Reporting System is (what are the names of the data/body strings?),
so that I can request the data for my LMS database.
And two more short questions:
- Can I receive only the % of the slide view, or Complete/Incomplete, in case I'm not using any interactions?
- Can I define in Captivate which variables to export, or do I have to deal with predefined formats?
Thank you,
Yakov Pinchasov

Hi 981222,
Have you checked the WriteBack functionality?
Have a look at it (Configuring and Managing Analyses and Dashboards - 11g Release 1 (11.1.1)).
Eldar.A

Similar Messages

  • Problem restoring LMS 2.6 data to LMS 4.0

    Hello,
    I'm trying to migrate from LMS 2.6 on a Windows 2003 server to LMS 4.0 on Windows 2008 R2.
    Two things I'll note here which might help future people attempting to use the LMS 4.0 backup/restore scripts as detailed in the migration document:
    1) Make sure nobody has installed cygwin on the server before you run the backup scripts. If they have, remove it and the reg keys.
    2) On the target server, make sure to run the restore script CMD session using runas, or disable UAC on Windows 2008; otherwise the restore scripts will fail.
    My next problem is one I can't get past. I've been running the restore since yesterday morning and it still hasn't finished. It hasn't moved from 40% complete migrating the IPM data, but in the last 24 hours it has eaten up 6 GB more RAM.
    C:\Users\zadmkovesa>e:\sw\CSCOpx\bin\perl.exe e:\Sw\CSCOpx\bin\restorebackup.pl
    -d e:\lmsbackup\ -t e:\lmsbackup\temp
    Restore started at : 2011/03/15 11:21:12
    Please see 'E:\Sw\CSCOpx\log\restorebackup.log' for status.
         USER ID is ..................................... : zadmkovesa
         OS of the backup archive is..................... : Windows
         WARNING: The temp directory e:\lmsbackup\temp\tempBackupData is not empty.
                  An earlier instance of restore operation might have been interrupted before completion.
                  If you proceed with the restore operation, this directory and its sub directories will be deleted.
                  Do you want to continue?  (y-continue or n-quit, y/n)?y
         Generation to be restored is ................... : 0
         Backup taken from............................... : e:\lmsbackup\
         Common Services version in the backup data is... : 3.0.6
         Common Services is installed in................. : E:\Sw\CSCOpx
         The temp folder for this restore program........ : e:\lmsbackup\temp\tempBackupData
         Applications installed on this machine ......... : [Common Services][Campus Manager][Resource Manager Essentials][Device Fault Manager][cvw][cwlms][cwportal][ipm][upm]
         Applications in the backup archive ............. : [Common Services][Campus Manager][Resource Manager Essentials][Device Fault Manager][cvw][ipm]
         WARNING: The list of applications installed on this CiscoWorks server does not match the list of
                  applications in the backup archive. If you restore data from this backup archive, it may
                  cause problems in the CiscoWorks applications.
                  Do you want to continue the restore operation?  (y-continue or n-quit, y/n)?y
         Applications to be restored are................. : [Common Services] [Campus Manager] [Resource Manager Essentials] [Device Fault Manager] [cvw] [ipm]
         Available disk space in NMSROOT................. : 237023900 Kb
         Required disk space in NMSROOT.................. : 38857544 Kb
         (The temp and NMSROOT are on same device, therefore this required disk space includes temp space)
      Copying the backup files to the temporary location [e:\lmsbackup\temp\tempBackupData]
      preRestore of [Common Services] has started.
      preRestore of [Common Services] has completed.
      preRestore of [Campus Manager] has started.
      preRestore of [Campus Manager] has completed.
      preRestore of [Resource Manager Essentials] has started.
      preRestore of [Resource Manager Essentials] has completed.
      preRestore of [Device Fault Manager] has started.
      preRestore of [Device Fault Manager] has completed.
      preRestore of [cvw] has started.
      preRestore of [cvw] has completed.
      preRestore of [ipm] has started.
    Not all data from IPM 2.6 will be migrated to LMS 4.0. For information on the data that is migrated, see the "Migrating Data From LMS 2.6 or 2.6 SP1" section in the "Migrating Data to CiscoWorks LAN Management Solution 4.0" chapter in the LMS 4.0 Installation Guide for details. Do you want to continue[Y/N]:[Y]y
    Do you want the IPM 2.6 collectors present in the backup to be DELETED from the source routers [Y/N]:[Y]n
      preRestore of [ipm] has completed.
      doRestore of [Common Services] has started.
              License check started.
             WARNING: The license details in the server are different from the backup data.
                      After restoring, please check the license available in the server.
             WARNING: Your current license count is lower than your earlier license count.
                      If you restore the data now, devices that exceed the current licence count
                      will be moved to Suspended state.
              License check completed.
              Restoring certificate.
                 WARNING: Cannot evaluate the hostname, hence the certificate
                          may be from this host or another host.
                          [  Certificate not overwritten  ]
              Restored Certificate.
              Restoring Common Services database.
              Restored Common Services database.
              Restoring CMIC data.
              Restored CMIC data.
              Restoring CMC data.
              Restored CMC data.
              Restoring Security Settings.
                    INFO:
                    Backup Mode is ACS. In LMS we are not supporing ACS for AAA.
                    Setting Login Mode to CiscoWorks Local.
              Restored Security Settings.
              Restoring DCR data.
              Restored DCR data.
              Restoring Certificate key store.
              Restored Certificate key store.
              Restoring JAAS configuration.
              Restored JAAS configuration.
              JRM Job Migration started.
              JRM job Migration done.
      doRestore of [Common Services] has completed.
      doRestore of [Campus Manager] has started.
      doRestore of [Campus Manager] has completed.
      doRestore of [Resource Manager Essentials] has started.
    10% of RME  Restore completed
    30% of RME  Restore completed
    50% of RME  Restore completed
    70% of RME  Restore completed
    100% of RME  Restore completed
      doRestore of [Resource Manager Essentials] has completed.
      doRestore of [Device Fault Manager] has started.
    10% of DFM Restore completed
    30% of DFM Restore completed
    50% of DFM Restore completed
    80% of DFM Restore completed
    Going to modify Eight PM report
    Modified Sucessfully Eight PM report
    100% of DFM Restore completed
      doRestore of [Device Fault Manager] has completed.
      doRestore of [cvw] has started.
      doRestore of [cvw] has completed.
      doRestore of [ipm] has started.
    10% of IPM migration completed
    20% of IPM migration completed
    30% of IPM migration completed
    40% of IPM migration completed
    >>>
    The restore log is more than 8Mb in size so I won't upload it unless someone really wants to see it. Here is a snip from the end of it showing no clues I can make sense of:
    INFO: Getting message                                                                                                    
    INFO: Connect the database dsn=ipmdb
    INFO: Connected the Database
    INFO: Command Executed
    INFO: Connecting the Database ipmdb
    INFO: Company=Cisco Systems;Application=NMTG;Signature=010fa55157edb8e14d818eb4fe3db41447146f1571g32125eb777a87cbf8b29a954f559d4221b792ff8
    INFO: Preparing AUTH cmd
    INFO: AUTH Executed
    INFO: AUTH cmd finished
    INFO: Stopping the Database engine ipmdb
    INFO: E:\Sw\CSCOpx/objects/db/conf/ConfigureDB.LOCK released for the future operations...
    INFO: File not exists.[Tue Mar 15 13:17:30 2011] Successfully registered the database with DM
    [Tue Mar 15 13:17:30 2011] Successfully configured IPMDB database !..
    10% of IPM migration completed
    INFO: Start changing password for database 'ipm'...
        Starting database engine ipmEng
    INFO: New userinfo updated into database successfully.
        Stopping database engine ipmEng
    SQL Anywhere Command File Hiding Utility Version 10.0.1.4051
    INFO: Password changed in odbc template file for database 'ipm'.
    INFO: Password changed for ODBC registry 'ipm'.
    INFO: Password changed for Property file E:\Sw\CSCOpx\lib\classpath\com\cisco\nm\cmf\dbservice2\DBServer.properties.
    INFO: Database password successfully changed.
    INFO: Process created
    INFO: Started the Database engine : ipmEng Retry 0
    INFO: Started the Database engine : ipmEng Retry 1
    INFO: Started the Database engine : ipmEng Retry 2
    INFO: Started the Database engine : ipmEng Retry 3
    INFO: Started the Database engine : ipmEng Retry 4
    INFO: Getting message                                                                                                    
    INFO: Connect the database DSN=ipm
    INFO: Connected the Database
    INFO: Command Executed
    INFO: File not exists.[Tue Mar 15 13:18:08 2011] Updated IPM 2.6 DB password into IPM 5.0 DB
    20% of IPM migration completed
    The CiscoWorks Daemon Manager service is starting.
    The CiscoWorks Daemon Manager service was started successfully.
    [Tue Mar 15 13:18:12 2011] Waiting for all daemons to start completely...
    [Tue Mar 15 13:20:12 2011] Successfully stated daemon manager
    30% of IPM migration completed
    [Tue Mar 15 13:20:12 2011] Checking if IPM Process is up and running
    INFO:Daemon Manager is processing other tasks in the queue.Please try again later.
    [Tue Mar 15 13:25:12 2011] Process IPMProcess is not running or never started
    [Tue Mar 15 13:25:12 2011] Will check after 60 seconds
    INFO:Daemon Manager is processing other tasks in the queue.Please try again later.
    [Tue Mar 15 13:31:12 2011] Process IPMProcess is not running or never started
    [Tue Mar 15 13:31:12 2011] Will check after 60 seconds
    [Tue Mar 15 13:34:54 2011] Process IPMProcess is running normally
    [Tue Mar 15 13:34:54 2011] Let's proceed further..
    40% of IPM migration completed
    [Tue Mar 15 13:39:54 2011]  delete flag is  0
    [Tue Mar 15 13:39:54 2011]  Data passed to MigrMain : 2.6 5.0 E:\Sw\CSCOpx E:\Sw\CSCOpx e:\lmsbackup\temp\tempBackupData Migration e:\lmsbackup\temp\tempBackupData 0
    Should I just wait? Is this something that might take days if I've got a lot of IPM data, or is it broken? I don't really care about keeping the IPM data, so could I do something to complete the data migration for everything else? Any advice appreciated.
    regards,
    Aaron

    The restore can take a long time if you have a lot of data. The problem is that the database needs to be completely unloaded and reloaded to convert from ASA 9 to ASA 10. However, if you do not care about IPM data, and you still have LMS 2.6 running, you can take a new LMS backup without using the wrapper.pl script (this will keep IPM from being backed up).

  • Migrating DFM data from LMS 3.1 to LMS 4.2.3

    Hello, I need to migrate DFM alarm settings data from LMS 3.1 to LMS 4.2.3, and I want to use this method, http://www.cisco.com/en/US/docs/net_mgmt/ciscoworks_device_fault_manager/3.1/user/guide/useAAD.html#wp1433848 , to extract the data from 3.1 and then import it into 4.2.3.
    I successfully performed it for IP settings; it was easy since the data format was the same.
    But the format differs quite a lot for interface and port data. Here is an example:
    export from LMS 3.1
    IF-hostname/17 [Gi0/0.524] [10.55.254.3]; INTERFACE:;IF-hostname/17; MANAGED_STATE:;EXPLICITLY_UNMANAGED
    export from LMS 4.2.3
    INTERFACE:IF-hostname/17 MANAGED_STATE:MANAGED GigabitEthernet0/0.524
    It looks like I have to convert interface names, and sort and delete things, to make it look the same.
    Has anyone done this before who can give me some advice?
    BR /Crille

    Just want to report back that I found an answer to this: it is enough to import this string:
    INTERFACE:IF-hostname/17 MANAGED_STATE:MANAGED (or EXPLICITLY_UNMANAGED).
    So what I did was use Notepad++ and a regexp to filter out the needed data:
    Search for: ^.+(INTERFACE.+$)
    Replace with: \1
    and then another replace for ";" with nothing.
    Then I ended up with:
    INTERFACE:IF-hostname/17 MANAGED_STATE:EXPLICITLY_UNMANAGED
    which could be imported with the mentioned scripts in my original post.
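    The Notepad++ steps above can also be scripted; a minimal Python sketch of the same two replacements (the sample line is taken from the exports above):

```python
import re

def convert_line(line):
    """Convert an LMS 3.1 interface export line to the LMS 4.2.3 format."""
    # Keep everything from "INTERFACE" onward -- same as the regexp
    # search ^.+(INTERFACE.+$) with replacement \1.
    m = re.search(r'(INTERFACE.+)$', line)
    if m is None:
        return None
    # Second pass: replace ";" with nothing.
    return m.group(1).replace(';', '')

old = ('IF-hostname/17 [Gi0/0.524] [10.55.254.3]; '
       'INTERFACE:;IF-hostname/17; MANAGED_STATE:;EXPLICITLY_UNMANAGED')
print(convert_line(old))
# INTERFACE:IF-hostname/17 MANAGED_STATE:EXPLICITLY_UNMANAGED
```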

  • Quiz score not reporting to the LMS

    Initially the quizzes 'held' the previous answers so the user could not retake them. (Infinite attempts is selected.) I checked Never Send Resume Data, and now the quiz scores aren't reporting to the LMS. Any other suggestions of setting changes I should have made or could now try?

    So you want the user to be able to take the quiz as many times as they want? I'm not exactly sure how you want it to act.
    Do you have "Allow backward movement" selected in Quiz Settings? That should allow the user to repeat quiz questions, and I think you should NOT have a check in Never Send Resume Data.
    I'm no expert, that's for sure; I just figure things out as I go. I would try those things and see how it works.

  • Reporting info to LMS?

    Hi,
    Can someone tell me all about or point me to where I can get the info on how and what data gets "reported" to an LMS?
    I don't see anywhere that I can pass a variable. I also want to know when the info gets sent. I am assuming it is all gathered and sent at the end of a session, so that if the user changes his mind and unclicks and reclicks things, only the final answer is sent.
    Is there detailed documentation somewhere on this?
    Thanks much!
      Lori

    What gets reported completely depends on what you select in the Quiz Preferences.
    SCORM identifies specific things to track, i.e. 'lesson_location' to record the page where the lesson was exited, 'suspend_data' for any random information that needs to be tracked, 'score' for a quiz percentage (usually), 'status' for passed/failed or completed/incomplete, 'interaction' and 'objective' (in arrays) to record specific interaction/objective data....and so on.
    So what you can store has to match with those pre-defined values. You can't just say, for example, UserName == "Jane Doe" and tell the LMS to store UserName...
    'suspend_data' is mostly the only place you can store random data, but CP does not provide a way for you (that I know of) to put random information into that field.
    I've not explored when data is sent in CP4, but in CP3 it seemed to be sent on a timed-interval, every 7 seconds more or less. In some situations, it may be sent every time a page is advanced (which is the proper way,  IMO), or it may be stored in internal JS variables until the very end...
    Either way would be fine in the scenario you propose. Whether the data is sent after every page or all at the end, both would allow a user to go back and change their answers, and have those updated to the LMS....*if* you setup the quiz to allow users to change their answers....
    HTH
    Erik
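    The elements Erik lists correspond to the SCORM 1.2 cmi.* data model names; a small sketch of the sort of snapshot an LMS might receive from a SCO (the element names are real SCORM 1.2 ones, the values are made up for illustration):

```python
# Illustrative snapshot of SCORM 1.2 cmi.* elements as key/value pairs.
# Element names are from the SCORM 1.2 data model; values are invented.
scorm_data = {
    'cmi.core.lesson_location': 'slide_12',               # page where the lesson was exited
    'cmi.core.lesson_status':   'incomplete',             # passed/failed/completed/incomplete
    'cmi.core.score.raw':       '80',                     # quiz percentage, usually
    'cmi.suspend_data':         'bookmark=12;seen=1,2,3', # free-form resume data
    # Interactions are arrays, indexed per question:
    'cmi.interactions.0.id':     'Q1',
    'cmi.interactions.0.result': 'correct',
}

def status(data):
    """Return the tracked completion status, defaulting per SCORM 1.2."""
    return data.get('cmi.core.lesson_status', 'not attempted')

print(status(scorm_data))   # incomplete
```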

  • Stock Report Date Wise

    Dear All,
    I am developing a new stock report, date-wise.
    My output is as follows:
    Material | Plant | Storage location | Sales order | Opening stock | Closing stock | Total
    In which tables are these details stored?
    How are they calculated?
    Thanks and Regards,
    Suresh

    Hey,
    I think the following tables will be helpful for the report:
    1> MARA
    2> MBEWH
    3> MSEG
    4> VBRP
    Hope this helps.
    Good luck,
    Harry
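    Table names aside, the date-wise arithmetic behind such a report is standard: closing stock = opening stock + receipts - issues, and each period's opening stock is the previous period's closing stock. A sketch of that calculation (data layout hypothetical, not an actual SAP table read):

```python
# Date-wise stock: closing = opening + receipts - issues,
# and each day's opening is the previous day's closing.
# Movements are (date, qty) pairs: +qty for receipts, -qty for issues.

def stock_by_date(opening, movements):
    """Return {date: (opening, closing)} in date order."""
    daily = {}
    for date, qty in movements:
        daily[date] = daily.get(date, 0) + qty
    result = {}
    for date in sorted(daily):
        closing = opening + daily[date]
        result[date] = (opening, closing)
        opening = closing          # next period's opening stock
    return result

moves = [('2011-03-01', 100), ('2011-03-01', -30), ('2011-03-02', -20)]
print(stock_by_date(0, moves))
# {'2011-03-01': (0, 70), '2011-03-02': (70, 50)}
```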

  • Error while creating a new report data source in Sharepoint Document library

    I installed SSRS (SQL Server 2012 SP1) in my SharePoint 2013 farm (single-server installation).
    I have also installed the SSRS add-in for reports.
    I created a document library and included the Report Data Source, Report Builder Report, and Report Builder Model content types. On clicking any of these content types I get the error below:
    "new Document requires a Microsoft SharePoint Foundation-compatible application and web browser. To add document to document library, click on the 'Upload' button."
    I am using the Windows 7 64-bit operating system with Google Chrome and IE 8 64-bit.
    Any help would be appreciated.

    Rakesh, is this an SSRS question?
    This doesn't look like Power View.
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)
    Answer an interesting question? Create a wiki article about it!

  • Getting 401 error while creating a Report Data Source with MOSS 2010 Foundation

    I have set up SQL Server 2008 R2 Reporting Services with SharePoint 2010 Foundation in SharePoint integrated mode. SharePoint Foundation is on machine 1, whereas SQL Server 2008 R2 and the SSRS Report Server are on machine 2. While configuring Reporting Services - SharePoint integration, I used "Windows Authentication" as the Authentication Mode (I need to use Kerberos).
    My objective is to setup a Data Connection Library, a Report Model Library, and a Reports Library so that I can upload a Report Data Source, some SMDLs, and a few Reports onto their respective libraries.
    While creating the top level site, "Business Intelligence Center" was not available for template selection since SharePoint Foundation is being used. I therefore selected "Blank Site" as the template.
    While creating a SharePoint Site under the top level site, for template selection I again had to select "Blank Site".
    I then proceeded to create a library for the data connection. Towards this, I created a new document library and selected "Basic page" as the document template. I then went to Library Settings for this newly created library and clicked on Advanced Settings. In the Advanced Settings page, for "Allow management of content types?" I selected "Yes". Then I clicked on "Add from existing content types" and selected "Report Data Source". I deleted the existing "Document" content type for this library.
    Now I wanted to create a Data Connection in the above Data Connection library. For this, when I clicked on "New Document" under "Documents" of "Library Tools" and selected "Report Data Source", I got the error "The request failed with HTTP status 401: Unauthorized.".
    Can anybody tell me why I am getting this error?
    Note: I have created the site and the library using SharePoint Admin account.

    Hi,
    Thank you for your detailed description. From it, I noticed that the report server is not part of the SharePoint farm. Add the report server to the SharePoint farm and see how it works.
    To join a report server to a SharePoint farm, the report server must be installed on a computer that has an instance of a SharePoint product or technology. You can install the report server before or after installing the SharePoint product or technology instance.
    For more information, see
    http://msdn.microsoft.com/en-us/library/bb283190.aspx
    Thanks.
    Tracy Cai
    TechNet Community Support

  • UCCX 7.0.1SR5 to 8.0 upgrade while also adding LDAP integration for CUCM - what happens to agents and Historical Reporting data?

    Current State:
    •    I have a customer running CUCM 6.1 and UCCX 7.01SR5.  Currently their CUCM is *NOT* LDAP integrated and using local accounts only.  UCCX is AXL integrated to CUCM as usual and is pulling users from CUCM and using CUCM for login validation for CAD.
    •    The local user accounts in CUCM currently match the naming format in active directory (John Smith in CUCM is jsmith and John Smith is jsmith in AD)
    Goal:
    •    Upgrade software versions and migrate to new hardware for UCCX
    •    LDAP integrate the CUCM users
    Desired Future State and Proposed Upgrade Method
    Using the UCCX Pre-Upgrade Tool (PUT), back up the current UCCX 7.01 server.
    Then, during a weekend maintenance window:
    •    Upgrade the CUCM cluster from 6.1 to 8.0 in 2 step process
    •    Integrate the CUCM cluster to corporate active directory (LDAP) - sync the same users that were present before, associate with physical phones, select the same ACD/UCCX line under the users settings as before
    •    Then build UCCX 8.0 server on new hardware and stop at the initial setup stage
    •    Restore the data from the UCCX PUT tool
    •    Continue setup per documentation
    At this point does UCCX see these agents as the same as they were before?
    Is the historical reporting data the same with regards to agent John Smith (local CUCM user) from last week and agent John Smith (LDAP imported CUCM user) from this week ?
    I have the feeling that UCCX will see the agents as different almost as if there is a unique identifier that's used in addition to the simple user name.
    We can simplify this question along these lines
    Starting at the beginning with CUCM 6.1 (local users) and UCCX 7.01, let's say the customer decided to LDAP-integrate the CUCM users and not upgrade any software.
    If I follow the same steps, re-associating the users to devices and selecting the ACD/UCCX extension, what happens?
    I would guess that UCCX would see all the users it knew about get deleted (making them inactive agents) and then see a whole group of new agents get created.
    What would historical reporting show in this case?  A set of old agents and a set of new agents treated differently?
    Has anyone run into this before?
    Is my goal possible while keeping the agent configuration and HR data as it was before?

    I was doing some more research, looking at the DB schema for UCCX 8.
    Looking at the Resource table in UCCX, it appears there is a primary key that represents each user.
    My question: is this key replicated from CUCM, or created locally when the user is imported into UCCX?
    How does UCCX determine that user account jsmith in CUCM, when it's a local account, is different from user account jsmith in CUCM that is LDAP-imported?
    Would it be possible (with TAC's help, most likely) to edit this field back to the previous values so that AQM and historical reporting would think the user accounts are the same?
    Database table name: Resource
    The Unified CCX system creates a new record in the Resource table when the Unified CCX system retrieves agent information from the Unified CM.
    A Resource record contains information about the resource (agent). One such record exists for each active and inactive resource. When a resource is deleted, the old record is flagged as inactive; when a resource is updated, a new record is created and the old one is flagged as inactive.
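    The flag-as-inactive behaviour described for the Resource table can be sketched generically like this (field names are illustrative, not the actual UCCX schema); it also shows why a re-imported jsmith would end up under a new primary key:

```python
# Generic sketch of flag-as-inactive record versioning, as described
# for the UCCX Resource table: an update creates a new row and the
# old row is kept but flagged inactive.

class ResourceTable:
    def __init__(self):
        self.rows = []        # each row: {'id', 'login', 'active', ...}
        self._next_id = 1     # locally assigned surrogate primary key

    def upsert(self, login, **attrs):
        # Flag any currently active record for this login as inactive.
        for row in self.rows:
            if row['login'] == login and row['active']:
                row['active'] = False
        # Insert a fresh record with a brand-new surrogate key.
        row = {'id': self._next_id, 'login': login, 'active': True, **attrs}
        self._next_id += 1
        self.rows.append(row)
        return row

t = ResourceTable()
t.upsert('jsmith', source='local')   # original local-account agent
t.upsert('jsmith', source='ldap')    # same login, but a new primary key
active = [r for r in t.rows if r['active']]
print(len(t.rows), active[0]['id'])  # 2 2
```

    If historical reporting joins on that surrogate key rather than the login, the pre- and post-LDAP jsmith would indeed be treated as different agents.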

  • Multiple data sets: a common global dataset and per/report data sets

    Is there a way to have a common dataset included in an actual report data set?
    Case:
    For one project I have about 70 different letters, each letter being a report in Bi Publisher, each one of them having its own dataset(s).
    However, all of these letters share a common standardized reference block (e.g. the user, his email address, his phone number, etc.); this common reference block comes from a common dataset.
    The layout of the reference block is done by including a sub-layout (RTF file).
    The SQL query for getting the dataset of the reference block is always the same and, for now, is included in each of the 70 reports.
    This makes maintenance of the reference block very hard, because each of the 70 reports must be adapted when changes to the reference block/dataset are made.
    Is there a better way to handle this? Can I include a shared dataset, defined and maintained only once, in each single report definition?

    Hi,
    Using a subtemplate for the centrally managed layout is OK.
    However, I would like to be able to do the same thing for the datasets in the reports:
    one centrally managed dataset (definition) for the common dataset, which is dynamic and, in our case, a rather complex query,
    and
    datasets defined on a per-report basis.
    It would be nice if we could do a kind of 'include dataset from another report' when defining the datasets for a report.
    Of course, this included dataset would be executed within each individual report.
    This would make maintaining the one central query easier than maintaining it in each of the 70 reports over and over again.

  • Loading complex report data into a direct update DSO using APD

    Dear All,
    Recently, I had a requirement to download report data into a direct update DSO using an APD. I was able to do this easily when the report was simple, i.e. it had few rows and columns, but I faced problems if the report is a complex one. Summing up, I would like to know how to handle each of the following cases:
    1. How should I decide the key fields and data fields of the direct update DSO? Is it that the elements in ROWS will go to the key fields of the DSO and the remaining ones to the data fields? Correct me.
    2. What if the report contains Restricted KFs and Calculated KFs? Do I have to create separate InfoObjects in the BI system and then include these in the DSO data fields to accommodate the extracted data?
    3. How do I handle the free characteristics and filters?
    4. Moreover, I observed that if the report contains selection-screen variables, then I need to create variants in the report and use those variants in the APD. So, if I have 10 sets of users executing the same report with different selection conditions, do I need to create 10 different variants and pass those into 10 different APDs, all created for the same report?
    I would appreciate if someone can answer my questions clearly.
    Regards,
    D. Srinivas Rao

    Hi,
    Please find the answers below.
    1. How should I decide the key fields and data fields of the direct update DSO? Is it that the elements in ROWS will go to the key fields of the DSO and the remaining to the data fields?
    --- Yes, you can use the elements in ROWS as the key fields, but if you get two records with the same value in the ROWS elements, the data load will fail. So you basically need one value that is different for each record.
    2. What if the report contains Restricted KFs and Calculated KFs? Do I have to create separate InfoObjects in the BI system and then include these in the DSO data fields to accommodate the extracted data?
    --- Yes, you would need to create new InfoObjects for the CKFs and RKFs in the report and include them in your DSO.
    3. How do I handle the free characteristics and filters?
    --- The default filters work in the same way as when you execute the report yourself. But you cannot use the free characteristics in the APD; only the ROWS and COLUMNS elements which are in the default layout can be used.
    4. Moreover, I observed that if the report contains selection-screen variables, then I need to create variants in the report and use those variants in the APD. So, if I have 10 sets of users executing the same report with different selection conditions, do I need to create 10 different variants and pass those into 10 different APDs, all created for the same report?
    --- Yes, you would need to create 10 different APDs. They are very simple to create (you can copy an APD), but it would for sure be a maintenance issue: you would have to maintain 10 APDs.
    Please revert in case of any further queries.

  • PowerPivot report data refresh error Data source as SharePoint list Data Feed

    Hi All,
    I am facing a problem with auto refresh. The report data source is a SharePoint list data feed. I saved it to my PC desktop, and when I set up auto refresh it fails. I then moved the SharePoint list data feed ("atomsvc" file) to a SharePoint data feed library, connected the report to it, and started a manual refresh, which succeeded; but after the next schedules it failed again.
    Error message:
    Errors in the high-level relational engine. The following exception occurred while the managed IDbConnection interface was being used: 
    The network path was not found. ;The network path was not found. The network path was not found. . 
    A connection could not be made to the data source with the DataSourceID of '82f49f39-45aa-4d61-a15c-90ebbc5656d', 
    Name of 'DataFeed testingreport'. An error occurred while processing the 'Testing Report' table. The operation has been cancelled.  
    Thank you very much !
    erkindunya

    If you are using Claims Auth then this is a known limitation.  Auto refresh only works under Classic Auth.
    I trust that answers your question...
    Thanks
    C
    http://www.cjvandyk.com/blog | LinkedIn | Facebook | Twitter | Quix Utilities for SharePoint | Codeplex

  • Report data binding error

    I have created a banded report split into departments. Each record has a value associated with it. The report runs fine if I don't try to sub-total each department's value, but if I add a calculated field to the banding, I get the following error:
    Report data binding error Error evaluating expression :
    textField_2 Source text : calc.Department_Total.
    Variable calc.Department_Total is undefined.
    The calculated field is simply the sum of the values, with an initial value of 0, set to reset when the group changes on the department. I am using the same data type for the calc field as was automatically given for the original Value field (BigDecimal).
    Any ideas?
    Dave H

    Does anyone have any ideas about this? It's getting a bit critical now. Has anyone else been able to do sums that calculate on group changes? The sum total works for the report, just not the bands. I'm desperate here, pulling my hair out.
    Regards
    Dave H
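    The semantics Dave wants from the calculated field, a running sum with initial value 0 that resets when the department group changes, look like this in plain code (a sketch of the behaviour only, not the reporting tool's API):

```python
# Running subtotal that resets on group (department) change.
# Rows are assumed to be sorted by department, as in a banded report.

def department_totals(rows):
    """rows: list of (department, value); returns {department: subtotal}."""
    totals = {}
    current_dept, subtotal = None, 0
    for dept, value in rows:
        if dept != current_dept:          # group changed: store and reset to 0
            if current_dept is not None:
                totals[current_dept] = subtotal
            current_dept, subtotal = dept, 0
        subtotal += value
    if current_dept is not None:          # flush the final group
        totals[current_dept] = subtotal
    return totals

rows = [('IT', 10), ('IT', 5), ('HR', 7)]
print(department_totals(rows))   # {'IT': 15, 'HR': 7}
```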

  • Xcelsius - Crystal Report Data Consumer connection Issues

    Sorry for the long post, but I hope that with the full explanation I can get a quick answer and solution.
    Using Crystal Reports 2008 and Xcelsius 2008 Engage, with a SQL Server 2008 stored procedure.
    I have tested the Crystal Reports Data Consumer connection within the Xcelsius program, using the following steps:
    1. Create a stored procedure that returns the Continent Name and the Count of Projects within each Continent.
    2. Create a Crystal Report using the stored procedure, 2 columns in the Details section, column names in the Page Header section.
    3. Save and export this as an Excel spreadsheet.
    4. Open the Xcelsius program, new file and import spreadsheet. Column A is the Continent Name, Column B is the Count.
    5. Without altering the spreadsheet, I created a graph using the embedded data: i. bar graph, with series and row selections; ii. pie chart, with row data selection; iii. column data as series and row selections.
    6. Created a Crystal Report Data Consumer connection using the data embedded within the spreadsheet, with column A as the "Row Headers" and column B as the "Data". It took quite a while to get the right combination of selected components so that the preview actually showed the data and the corresponding data labels. The legend still hasn't shown up.
    7. Save and Export as Flash (swf) file.
    8. Close Xcelsius and open Crystal Reports.
    9. Open the same report used to create the spreadsheet and Insert -> Flash, choosing the newly created swf file.
    10. Link the data displayed on the Crystal Report to the Flash file now embedded, using the Flash Expert. I have tried both with the data listed on the Report, and with the data listed as the result of the stored procedure.
    11. Previewed and the Data shown on the Graph does match the data listed within the Details section of the Report.
    12. I then altered the stored procedure to add "1" to all counts, and refreshed the Crystal Report. (I cannot add new data to the underlying database/tables, due to various other folks & projects using that same database.)
    13. The listed data does change to match the changes within the procedure.
    14. The Data labels on the Graph do change to match the changes within the procedure.
    15. The Actual Displayed Data within the Graph Does Not change to match the changes within the procedure.
    I have attempted this with a pie chart and with bar and column charts, with the same effect.
    I have searched the web and printed out 4 different examples of how to make this work, and still it is hit or miss. I still cannot get the legend to show up, either in the preview or within Crystal Reports. When I preview the graph before adding the Crystal Report Data Consumer connections, it does work.
    Please, someone tell me: what am I missing here?
    Thanks in advance for your help.
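    For reference, the stored procedure in step 1 is essentially a group-by count. A rough Python stand-in is shown below, purely to illustrate the two-column (Continent Name, Count) shape that ends up in columns A and B of the imported spreadsheet; the project list is invented:

```python
# A rough stand-in for the step-1 stored procedure: count projects
# per continent. The input data here is made up purely to show the
# two-column (Continent Name, Count) result shape fed to Xcelsius.
from collections import Counter

def projects_per_continent(projects):
    """projects: iterable of continent names, one entry per project.
    Returns (continent, count) rows sorted by continent name."""
    counts = Counter(projects)            # tally projects per continent
    return sorted(counts.items())         # stable, alphabetical order

projects = ["Europe", "Asia", "Europe", "Africa", "Asia", "Europe"]
for continent, count in projects_per_continent(projects):
    print(continent, count)
```

    Adding "1" to every count (step 12) changes only the second column, which is why the data labels refresh while a cached swf can keep drawing the old values.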

    http://www.****************/Tutorials/BI/Xcelsius/Index.htm
    http://www.resultdata.com/Company/News/Newsletter/2008/aug/Articles/Xcelsius/Using%20the%20Xcelsius%20Crystal%20Report%20Data%20Consumer.htm
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10161e25-c158-2c10-3086-ad502098b36b?QuickLink=index&overridelayout=true
    http://codesforprogrammers.blogspot.com/search/label/How%20to%20suppress%20blank%20row(s)%20in%20detail%20section%20of%20the%20report%3F
    http://www.businessobjects.com/pdf/product/catalog/crystalreports/cr_xc_integration.pdf

  • SAP Crystal Reports data source connection problem using sap business one

    Hi,
    I'm facing a problem regarding a SAP Crystal Reports data source connection using SAP Business One.
    I am trying to create a Crystal report but when I try to configure a new connection it does not work.
    I select the SAP Business One data source and try to complete the information required for the connection, but it does not list my company databases. What is the problem?
    Our Current SAP related software details are as follows:
    OS: Windows Server 2008
    SAP B1 Version: SAP B1 9 (902001) Patch 9
    SAP Crystal Report Version: 14.0.4.738 RTM
    Database: MS SQL Server 2008 R2
    I have also added some screenshots of the issues.
    Please have a look and let me know if you have any questions or any further clarifications.
    I'm eagerly awaiting a quick and positive reply.

    Hi,
    There is a problem with the SAP Business One data source.
    I faced the same problem; I used an OLEDB data source instead, and it worked fine for me.
    So, try to use OLEDB.
    Regards,
    Amrut Sabnis.
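    As a sketch of that workaround, the snippet below assembles an ODBC-style connection string for SQL Server of the kind an OLEDB/ODBC driver would consume. The server, database, and credential values are placeholders, not real settings, and a driver such as pyodbc would be needed to actually open the connection:

```python
# Sketch of assembling an ODBC-style connection string for SQL Server,
# along the lines of the OLEDB workaround suggested above. All values
# shown (server, database, credentials) are placeholders.
def build_conn_str(server, database, user=None, password=None):
    parts = ["DRIVER={SQL Server}", f"SERVER={server}", f"DATABASE={database}"]
    if user is None:
        parts.append("Trusted_Connection=yes")  # Windows authentication
    else:
        parts.append(f"UID={user}")             # SQL Server authentication
        parts.append(f"PWD={password}")
    return ";".join(parts)

# With a driver such as pyodbc this string would be passed to
# pyodbc.connect(); here we only build and print it.
print(build_conn_str("MYSERVER\\SQLEXPRESS", "SBODemoUS"))
```

    Pointing Crystal Reports at the underlying MS SQL Server database this way sidesteps the B1 data source, at the cost of losing the B1-specific semantic layer.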
