Do We Need to Validate Data Before Loading Into Planning?

We are debating whether to load data from GL into Planning using ODI or FDM. If we need some form of validity check on the data, we will have to use FDM; otherwise I believe ODI is good enough.
My question is: for financials planning, what determines whether we need validity checks or not? How do we decide that?

FDM provides validation and data-load audit options, but validation can be as simple as comparing totals by GL account between the source and Planning. You should be able to use ODI, FDM, or load rules to load data into Hyperion and complete the validation outside the load using any of the reporting options.
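
For illustration, a minimal Python sketch of that totals-by-account check, assuming the GL extract and a Planning/Essbase export are both available as delimited files (the file names, column positions, header-less layout, and 0.01 tolerance are all assumptions):

# Compare per-account totals between a GL extract and a Planning export.
import csv
from collections import defaultdict

def totals_by_account(path, account_col, amount_col, delimiter=","):
    """Sum the amount column per GL account in a delimited, header-less extract."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter=delimiter):
            totals[row[account_col]] += float(row[amount_col])
    return totals

gl = totals_by_account("gl_extract.csv", account_col=0, amount_col=4)
pln = totals_by_account("planning_export.csv", account_col=0, amount_col=1)

for account in sorted(set(gl) | set(pln)):
    diff = gl.get(account, 0.0) - pln.get(account, 0.0)
    if abs(diff) > 0.01:  # small tolerance for rounding
        print(f"{account}: GL={gl.get(account, 0.0):.2f} "
              f"Planning={pln.get(account, 0.0):.2f} diff={diff:.2f}")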

Similar Messages

  • Depersonalising Data Before loading into DB

    Hi Guys,
    I need some help on de-personalizing customer data before loading it into the database using SSIS.
    Once all the transformations are done and we finally want to load the data into the respective tables, we need to de-personalize it.
    Also, how will it handle the datatype of each table column that needs to be de-personalized?
    Later on we have to decrypt the data again once it has been tested by the testers.
    Anky

    Hi Raj
    We have to encrypt the data before loading it into the table.
    Since we are not encrypting the client ID, it can still be used to join with other tables for testing purposes, but the testers won't be able to see the other client personal data, like account number, address, DOB, etc.
    We have to decrypt the data back once testing is done.
    Anky
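
    A rough Python sketch of that reversible masking, using the cryptography package (in SSIS the equivalent logic would live in a Script Component; the column names here are hypothetical). It also bears on the datatype question: Fernet tokens are base64 text, so every encrypted column must be stored as a sufficiently wide VARCHAR, whatever its original type.

    # pip install cryptography
    from cryptography.fernet import Fernet

    SENSITIVE = ["account_number", "address", "dob"]  # client_id stays clear for joins

    key = Fernet.generate_key()  # keep safe -- needed to decrypt after testing
    cipher = Fernet(key)

    def depersonalize(row):
        """Encrypt only the sensitive columns; leave join keys readable."""
        return {c: cipher.encrypt(str(v).encode()).decode() if c in SENSITIVE else v
                for c, v in row.items()}

    def repersonalize(row):
        """Reverse the masking once testing is done."""
        return {c: cipher.decrypt(v.encode()).decode() if c in SENSITIVE else v
                for c, v in row.items()}

    row = {"client_id": 42, "account_number": "1234567890",
           "address": "1 Main St", "dob": "1980-01-01"}
    masked = depersonalize(row)
    assert masked["client_id"] == 42     # still usable for joins
    assert repersonalize(masked) == row  # round-trips after testing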

  • UNIX sed commands to clean up data before loading into BI

    Hi all,
    we are trying to load data into BI from text files. This data needs to be cleaned before it can be loaded into BI, and it has to be done using UNIX sed commands!
    We basically need to do a data-cleansing act, like removing unwanted characters (tab and newline values embedded in the characteristic values). How can this be done using UNIX sed commands?
    Your help is very much appreciated.
    Regards

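
    For illustration, a Python sketch of the kind of cleanup being asked about, with rough shell equivalents noted in the comments (the file names, field delimiter, and field count are assumptions; sed works line by line, which is why the embedded newlines are the awkward part):

    import re

    EXPECTED_FIELDS = 12   # assumption: every record has a fixed column count
    DELIM = "|"            # assumption: pipe-delimited extract

    def clean(value):
        # roughly: sed 's/[\t\r][\t\r]*/ /g' -- collapse tabs and stray \r to a space
        return re.sub(r"[\t\r]+", " ", value).strip()

    records, buffer = [], ""
    with open("extract.txt", encoding="utf-8") as src:
        for line in src:
            # A newline embedded in a value leaves the physical line short of
            # fields, so keep buffering until the expected count is reached.
            piece = line.rstrip("\n")
            buffer = piece if not buffer else buffer + " " + piece
            if buffer.count(DELIM) >= EXPECTED_FIELDS - 1:
                records.append(DELIM.join(clean(f) for f in buffer.split(DELIM)))
                buffer = ""

    with open("extract_clean.txt", "w", encoding="utf-8") as dst:
        dst.write("\n".join(records) + "\n")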

  • Do I need to uninstall CS5 before loading CS6?

    Do I need to uninstall CS5 before loading CS6? I purchased the full version of CS6.

    Hi pamelakirkpatrick,
    Welcome to Adobe Forums.
    You don't need to uninstall the existing version to install the new version.
    You might consider removing the existing version if you no longer use it, as storing multiple versions can occupy a lot of hard disk space.
    Please reply if you need any further assistance.
    Thanks

  • ODI: how to raise a cross-reference error before loading into Essbase?

    Hi John... if you read my post, I want to say that you impress me! Really, thanks for your blog.
    Today, my problem is :
    - I received a bad-quality data file from an ERP extract
    - I have a cross-reference table (Source ==> Target)
    - >> How can I raise the error before loading into Essbase?
    My idea is the following (first of all, I'm not sure if it is a good one, and I am also having trouble doing it in ODI!):
    - Step 1: make a JOIN between data.txt and the cross-reference table ==> create a table DATA_STEP1 in the ODISTAGING schema (the columns of DATA_STEP1 are the columns of data.txt plus those of the cross-reference tables; there are more than 20 columns in my case)
    - Step 2: check that there is no NULL value in the target columns (NULL means that the data.txt file contains values that are not defined in my cross-reference table) by using a filter (Filter = Target_Account IS NULL or Target_Entity IS NULL or ...)
    The result of this interface is sent to a reject.txt file; if reject.txt is not empty, then a mail is sent to the administrator
    - Step 3: do the opposite: Filter NOT (Target_Account IS NULL or Target_Entity IS NULL ...) ==> the result is sent to the DATA_STEP3 table
    - Step 4: run the mapping properly: source DATA_STEP3 (the clean and verified data!) joined with the cross-reference tables, and send the data into Essbase. Normally, there are no rejected records!
    My main problem is: what is the right IKM to send data into the DATA_STEP1 or DATA_STEP3 table, which are Oracle tables in my ODISTAGING schema? I tried IKM Oracle Incremental Update but I get an error, and actually I don't need an update (which is time consuming), I just need an INSERT!
    I'm just looking for an 'IKM SQL to Oracle'...
    regards
    xavier
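
    For what it's worth, a Python sketch of steps 2 and 3 above outside ODI: split the incoming rows into a clean set and a reject set based on the cross-reference lookups (file names, delimiter, and column positions are assumptions):

    import csv

    def load_xref(path):
        """Cross-reference file assumed to contain: source_member;target_member"""
        with open(path, newline="") as f:
            return dict(csv.reader(f, delimiter=";"))

    xref_account = load_xref("transco_account.csv")
    xref_entity = load_xref("transco_entity.csv")

    clean, rejects = [], []
    with open("data.txt", newline="") as f:
        for row in csv.reader(f, delimiter=";"):
            src_account, src_entity = row[0], row[1]        # assumed positions
            target_account = xref_account.get(src_account)  # None = unmapped ("NULL")
            target_entity = xref_entity.get(src_entity)
            if target_account is None or target_entity is None:
                rejects.append(row)                                      # step 2
            else:
                clean.append([target_account, target_entity] + row[2:])  # step 3

    with open("reject.txt", "w", newline="") as f:
        csv.writer(f, delimiter=";").writerows(rejects)
    if rejects:
        print(f"{len(rejects)} unmapped rows -- notify the administrator")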

    Thanks John: very speedy!
    I now understand better which IKM is useful.
    I found other information about the error followup with ODI : http://blogs.oracle.com/dataintegration/2009/10/did_you_know_that_odi_generate.html
    and I decided to activate Integrity Control in ODI:
    I load :
    - data.txt in ODITEMP.T_DATA
    - transco_account.csv in ODITEMP.T_TRANSCO_ACCOUNT
    - transco_entity.csv in ODITEMP.T_TRANSCO_ENTITY
    - and so on ...
    - Moreover I created integrity constraints between T_DATA and T_TRANSCO_ACCOUNT, T_TRANSCO_ENTITY, etc., so I expected ODI to raise the bad records for me in E$_DATA (the error table)!
    However, I have one issue when loading data.txt into T_DATA, because I have no ID or primary key... I read in a training book that I could use a SEQUENCE... I tried, but was unsuccessful :-(
    Is there another simple way to create a primary key automatically (T_DATA is in an Oracle schema, of course)? Thanks in advance
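
    One simple workaround, sketched in Python: stamp a surrogate ROW_ID onto each record while staging data.txt, before ODI picks it up. (Inside the database, an Oracle sequence with a before-insert trigger, or an identity column on 12c and later, achieves the same thing. The file names and delimiter are assumptions.)

    # Prefix every record with a generated ROW_ID to serve as the primary key.
    with open("data.txt", encoding="utf-8") as src, \
         open("data_with_id.txt", "w", encoding="utf-8") as dst:
        for row_id, line in enumerate(src, start=1):
            dst.write(f"{row_id};{line.rstrip()}\n")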

  • Can we execute reporting while data is loading into that ODS/Cube?

    Hi Friends,
    Can we execute reports on a particular ODS/InfoCube in the following cases:
    1) when data is loading into that ODS/InfoCube?
    2) when we are archiving data from that ODS/InfoCube?
    Thanks & Regards,
    Shaliny. M

    Hi Shaliny,
    First of all, you are in the wrong forum; in the Business Intelligence Old Forum (Read Only Archive) you will find better support.
    If you are loading data into an InfoCube, you will be able to report only on requests up to the last one flagged with the "ready for reporting" icon. In the case of an ODS object, I don't think you will get valid reporting, since the ODS data first needs to be activated.
    Nevertheless please post your question in the above link.
    Kostas

  • How can I add dimensions and load data into Planning applications?

    Please let me know how I can add dimensions and load data into a Planning application without doing it manually.

    You can use tools like ODI, DIM, or HAL to load metadata and data into Planning applications.
    The data load can be done at the Essbase end using a rules file, but metadata changes should flow from Planning to Essbase through any of the above-mentioned tools; there are also many other ways to achieve the same.
    - Krish
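
    As one illustration of automating the metadata side, a Python sketch that turns a flat extract of members into a parent/child dimension-load file. The input layout, output header, and root member name are all assumptions; check the exact format your load tool expects. Outline loads generally require a parent to appear before its children, hence the depth-first ordering:

    import csv
    from collections import defaultdict

    children = defaultdict(list)  # parent -> [members]
    details = {}                  # member -> (parent, alias)
    with open("accounts_flat.csv", newline="") as src:
        for member, parent, alias in csv.reader(src):
            children[parent].append(member)
            details[member] = (parent, alias)

    def walk(node):
        """Depth-first walk so parents are written before their children."""
        for child in children.get(node, []):
            yield child
            yield from walk(child)

    with open("account_dim_load.csv", "w", newline="") as dst:
        w = csv.writer(dst)
        w.writerow(["Parent", "Account", "Alias: Default"])  # assumed header
        for member in walk("Account"):                       # assumed root member
            parent, alias = details[member]
            w.writerow([parent, member, alias])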

  • Can transaction data be loaded into an InfoObject?

    Hi Gurus
    Can transaction data be loaded into InfoObjects? I would appreciate it if someone could give a simple definition of transaction data.
    GSR

    Hi,
    You can probably do that, but why would you want to? Transaction data is generally required for querying purposes, and queries run best on a multidimensional cube structure, not a flat structure.

  • Do I need to have data before performing Drag and Drop?

    I have gotten drag and drop working with Swing in JDK 1.4b2 using "URL-based" files instead of operating-system-native files. This was accomplished by creating a wrapper class that sub-classed File and would download the contents of the URL to a temporary file, then initiate a normal drag-and-drop operation using the normal Java file mechanisms. However, when you have a large file the operation can take too long, since I am front-loading all of the effort at the start of the drag. I would like to delay producing the bytes/files to give to the operating system until after there has been a successful drop, at which point I can do the heavy lifting and raise a dialog telling the user that the action is commencing. As best as I can tell, you must have all of the data BEFORE any operation (which could be a potential design flaw!).
    So, is there a way to get a hook that there has been a successful drop to a VM-external drop target before the VM gives the data over? Even if I create my own data flavor (which isn't well documented outside of text types), won't I still run into the same problems? Am I just overlooking something simple?
    Thanks for your help.

    Hello
    I've had the same problem, but take a look at: http://foxtrot.sourceforge.net/
    You can use their API to start the long, time-consuming job (reading files from the network) and also to paint a progress bar, showing the progress in the Event Dispatch Thread.
    This is how I solved my problem:
    I passed a MyTransferable object to the startDrag method of the DragGestureEvent.
    In the getTransferData method of the MyTransferable object (which implements the Transferable interface) I used the Worker.post method of their API, which reads the file from the network and updates a progress bar.
    The API lets the Event Dispatch Thread enter but not return from the listener method (getTransferData in my case), instead rerouting the Event Dispatch Thread to continue dequeuing events from the Event Queue and processing them (repainting the progress bar). Once the worker thread has finished, the Event Dispatch Thread is rerouted again, returning from the listener method.

  • User exit, BAdI or enhancement to validate data before saving in QM02

    Hi,
    Currently, I have a requirement to validate the data of a notification before it is saved in transaction QM02.
    I have found a user exit to check data for QM02. It works for the Task tab; I can get the data under the Task tab and do the validation. However, on the Processing tab I want to validate the partner name before saving, but I cannot get the list of partners shown there.
    Please let me know of a user exit, enhancement, or BAdI that would let me validate the data under the Processing tab in QM02.
    Thanks in advance,
    Hung To

    Hi Keshav.T,
    Currently I am using this exit to validate data for the Task tab. However, for the Processing tab I cannot get the data (say, Partner) to validate.
    Thanks,
    Hung To

  • Best practice to Consolidate forecast before loading into ASCP

    Hi,
    Can anyone suggest a best practice for consolidating forecasts that are kept in spreadsheets? The forecasts come from Sales, Finance and Operations; the consolidated forecast should then be loaded into ASCP.
    Is there any way to automate the load?
    Is Oracle S&OP the best product out there?
    Do we need Oracle Demand Management as well?
    Regards

    Forecast comes from Sales, Finance and Operations (spreadsheets)
    -> use integration interfaces to load the data into three different series: sales forecast, finance forecast and ops forecast.
    Then the consolidated forecast should be loaded into ASCP.
    -> create a workflow/integration interface to load the consolidation of the 3 series.
    So this can be done in DM.
    Also, a standard workflow exists in S&OP to publish a consensus forecast to ASCP, which accomplishes your objective.
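
    A Python sketch of the consolidation step itself: merge the three departmental spreadsheet extracts into one consensus series before the load. The file names, column layout, and the simple-average blend are all assumptions; in practice the blend rule (weights, overrides) is a business decision.

    import csv
    from collections import defaultdict

    sources = ["sales_fcst.csv", "finance_fcst.csv", "ops_fcst.csv"]
    buckets = defaultdict(list)  # (item, org, period) -> one qty per source

    for path in sources:
        with open(path, newline="") as f:
            for item, org, period, qty in csv.reader(f):
                buckets[(item, org, period)].append(float(qty))

    with open("consensus_fcst.csv", "w", newline="") as f:
        w = csv.writer(f)
        for (item, org, period), quantities in sorted(buckets.items()):
            w.writerow([item, org, period, sum(quantities) / len(quantities)])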

  • Data file load to Planning using FDMEE

    Hi All,
    Hyperion version : 11.1.2.3.0.26
    We have a currency Planning application, and the dimensions are Account, Business, Entity, Currency, Version, Scenario, Period and Year.
    My data file contains:
    Account;business;entity;version;data
    AC_1001;International;US_Region;working;10000
    AC_1002;International;US_Region;working;10000
    When I try loading data to this application using FDMEE, I get three gold fish, so I thought the load was successful, but when I tried retrieving the data from Smart View I found the data was not loaded.
    POV: Jan 15, Actual
    In Smart View from Essbase, the retrieve comes back with all #Missing (POV: HSP_InputValue, Jan, FY15, Actual, Working, International, US_Region; columns Local and USD):
                Local      USD
    AC_1001     #Missing   #Missing
    AC_1002     #Missing   #Missing
    Smart View from Planning: the ad hoc grid cannot be opened, as there are no valid rows of data.
    Not sure why this is happening; could you please help me with this? Thanks in advance!
    Regards,
    Keny Alex

    And this is the log:
    2015-01-29 02:33:35,503 INFO  [AIF]: FDMEE Process Start, Process ID: 621
    2015-01-29 02:33:35,503 INFO  [AIF]: FDMEE Logging Level: 4
    2015-01-29 02:33:35,504 INFO  [AIF]: FDMEE Log File: D:\demos\FDMEE\outbox\logs\RPDPLN_621.log
    2015-01-29 02:33:35,504 INFO  [AIF]: User:admin
    2015-01-29 02:33:35,505 INFO  [AIF]: Location:RPDLOC (Partitionkey:53)
    2015-01-29 02:33:35,505 INFO  [AIF]: Period Name:Jan 15 (Period Key:1/1/15 12:00 AM)
    2015-01-29 02:33:35,506 INFO  [AIF]: Category Name:Actual (Category key:1)
    2015-01-29 02:33:35,506 INFO  [AIF]: Rule Name:RPD (Rule ID:78)
    2015-01-29 02:33:37,162 INFO  [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
    [Oracle JRockit(R) (Oracle Corporation)]
    2015-01-29 02:33:37,162 INFO  [AIF]: Java Platform: java1.6.0_37
    2015-01-29 02:33:39,399 INFO  [AIF]: -------START IMPORT STEP-------
    2015-01-29 02:33:44,360 INFO  [AIF]: File Name: Datafile.txt
    2015-01-29 02:33:44,736 INFO  [AIF]: ERPI-105011:EPMERPI- Log File Name :D:\demos\FDMEE\outbox\logs\RPDPLN_621.log
    2015-01-29 02:33:44,738 INFO  [AIF]: ERPI-105011:EPMERPI- LOADID:PARTKEY:CATKEY:RULEID:CURRENCYKEY:FILEPATH::621;53:1:78:Local:D:\demos\FDMEE/
    2015-01-29 02:33:44,738 INFO  [AIF]: ERPI-105011:EPMERPI- ImportTextData - Start
    2015-01-29 02:33:44,920 INFO  [AIF]: ERPI-105011:EPMERPI- Log File Name :D:\demos\FDMEE\outbox\logs\RPDPLN_621.log
    2015-01-29 02:33:44,924 INFO  [AIF]: ERPI-105011:EPMERPI- File Name Datafile.txt
    periodKey2015-01-01
    2015-01-29 02:33:44,927 INFO  [AIF]: ERPI-105011:EPMERPI-  PROCESS ID: 621
    PARTITIONKEY: 53
    IMPORT GROUP: RPDVersion11
    FILE TYPE: DELIMITED
    DELIMITER: ;
    SOURCE FILE: Datafile.txt
    PROCESSING CODES:
    BLANK............. Line is blank or empty.
    NN................ Non-Numeric, Amount field contains non numeric characters.
    TC................ Type Conversion, Amount field could not be converted to a number.
    ZP................ Zero Suppress, Amount field contains a 0 value and zero suppress is ON.
    SKIP FIELD.............. SKIP field value was found
    NULL ACCOUNT VALUE.............. Account Field is null
    SKIP FROM SCRIPT.............. Skipped through Script
    Rows Loaded: 2
    Rows Rejected: 0
    2015-01-29 02:33:44,929 INFO  [AIF]: ERPI-105011:EPMERPI- ARCHIVE MODE: null
    2015-01-29 02:33:44,930 INFO  [AIF]: ERPI-105011:EPMERPI- Start archiving file:
    2015-01-29 02:33:44,930 INFO  [AIF]: ERPI-105011:EPMERPI- Archive file name: 62120150101.txt
    2015-01-29 02:33:44,931 INFO  [AIF]: ERPI-105011:EPMERPI- Deleting the source file: Datafile.txt
    2015-01-29 02:33:44,931 INFO  [AIF]: ERPI-105011:EPMERPI- File not deleted: D:\demos\FDMEE\Datafile.txt
    2015-01-29 02:33:44,938 INFO  [AIF]: ERPI-105011:EPMERPI- ImportTextData - End
    2015-01-29 02:33:44,938 INFO  [AIF]: ERPI-105011:EPMERPI- Total time taken for the import in ms = 200
    2015-01-29 02:33:45,069 INFO  [AIF]:
    Import Data from Source for Period 'Jan 15'
    2015-01-29 02:33:45,085 INFO  [AIF]: Generic Data Rows Imported from Source: 2
    2015-01-29 02:33:45,089 INFO  [AIF]: Total Data Rows Imported from Source: 2
    2015-01-29 02:33:45,783 INFO  [AIF]:
    Map Data for Period 'Jan 15'
    2015-01-29 02:33:45,794 INFO  [AIF]:
    Processing Mappings for Column 'ACCOUNT'
    2015-01-29 02:33:45,796 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,796 INFO  [AIF]:
    Processing Mappings for Column 'ENTITY'
    2015-01-29 02:33:45,797 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,797 INFO  [AIF]:
    Processing Mappings for Column 'UD1'
    2015-01-29 02:33:45,798 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,798 INFO  [AIF]:
    Processing Mappings for Column 'UD2'
    2015-01-29 02:33:45,799 INFO  [AIF]: Data Rows Updated by Rule Mapping '121' (LIKE): 2
    2015-01-29 02:33:45,836 INFO  [AIF]:
    Stage Data for Period 'Jan 15'
    2015-01-29 02:33:45,838 INFO  [AIF]: Number of Rows deleted from TDATAMAPSEG: 4
    2015-01-29 02:33:45,848 INFO  [AIF]: Number of Rows inserted into TDATAMAPSEG: 4
    2015-01-29 02:33:45,850 INFO  [AIF]: Number of Rows deleted from TDATAMAP_T: 4
    2015-01-29 02:33:45,851 INFO  [AIF]: Number of Rows deleted from TDATASEG: 2
    2015-01-29 02:33:45,859 INFO  [AIF]: Number of Rows inserted into TDATASEG: 2
    2015-01-29 02:33:45,860 INFO  [AIF]: Number of Rows deleted from TDATASEG_T: 2
    2015-01-29 02:33:45,919 INFO  [AIF]: -------END IMPORT STEP-------
    2015-01-29 02:33:45,946 INFO  [AIF]: -------START VALIDATE STEP-------
    2015-01-29 02:33:45,993 INFO  [AIF]:
    Validate Data Mappings for Period 'Jan 15'
    2015-01-29 02:33:46,001 INFO  [AIF]: Total Data Rows available for Export to Target: 2
    2015-01-29 02:33:46,001 INFO  [AIF]:
    Validate Data Members for Period 'Jan 15'
    2015-01-29 02:33:46,002 INFO  [AIF]: Total Data Rows available for Export to Target: 2
    2015-01-29 02:33:46,026 INFO  [AIF]: -------END VALIDATE STEP-------
    2015-01-29 02:33:46,089 INFO  [AIF]: -------START EXPORT STEP-------
    2015-01-29 02:33:49,084 INFO  [AIF]: [HPLService] Info: Cube Name: RPDFN
    2015-01-29 02:33:49,084 INFO  [AIF]: [HPLService] Info: Export Mode: STORE_DATA
    2015-01-29 02:33:49,084 INFO  [AIF]: [HPLService] Info: updateMultiCurrencyProperties - BEGIN
    2015-01-29 02:33:49,532 INFO  [AIF]: [HPLService] Info: Currency Properties Exist for Planning Application: RPDPLN
    2015-01-29 02:33:49,534 INFO  [AIF]: [HPLService] Info: Number of existing multi-currency property rows deleted: 7
    2015-01-29 02:33:49,537 INFO  [AIF]: [HPLService] Info: Base Currency for Application 'RPDPLN': USD
    2015-01-29 02:33:49,542 INFO  [AIF]: [HPLService] Info: Number of multi-currency property rows inserted: 7
    2015-01-29 02:33:49,542 INFO  [AIF]: [HPLService] Info: updateMultiCurrencyProperties - END
    2015-01-29 02:33:49,543 INFO  [AIF]: Updated Multi-Curency Information for application:RPDPLN
    2015-01-29 02:33:49,543 INFO  [AIF]: Connecting to essbase using service user:admin
    2015-01-29 02:33:49,572 INFO  [AIF]: Obtained connection to essbase provider:Embedded
    2015-01-29 02:33:49,576 INFO  [AIF]: Obtained connection to essbase cube RPDFN
    2015-01-29 02:33:49,593 INFO  [AIF]: Locking rules file AIF0078
    2015-01-29 02:33:49,595 INFO  [AIF]: Successfully locked rules file AIF0078
    2015-01-29 02:33:49,595 INFO  [AIF]: Copying rules file AIF0078 for data load as AIF0078
    2015-01-29 02:33:49,609 INFO  [AIF]: Unlocking rules file AIF0078
    2015-01-29 02:33:49,611 INFO  [AIF]: Successfully unlocked rules file AIF0078
    2015-01-29 02:33:49,611 INFO  [AIF]: The data rules file has been created successfully.
    2015-01-29 02:33:49,617 INFO  [AIF]: Locking rules file AIF0078
    2015-01-29 02:33:49,619 INFO  [AIF]: Successfully locked rules file AIF0078
    2015-01-29 02:33:49,625 INFO  [AIF]: Load data into the cube by launching rules file...
    2015-01-29 02:33:50,526 INFO  [AIF]: The data has been loaded by the rules file.
    2015-01-29 02:33:50,530 INFO  [AIF]: Unlocking rules file AIF0078
    2015-01-29 02:33:50,532 INFO  [AIF]: Successfully unlocked rules file AIF0078
    2015-01-29 02:33:50,532 INFO  [AIF]: Executed rule file
    2015-01-29 02:33:50,572 INFO  [AIF]: [HPLService] Info: Creating Drill Through Region for Process Id: 621
    2015-01-29 02:33:51,075 INFO  [AIF]: [HPLService] Info: Drill Through Region created for Process Id: 621
    2015-01-29 02:33:51,076 INFO  [AIF]: [HPLService] Info: [loadData:621] END (true)
    2015-01-29 02:33:51,117 INFO  [AIF]: -------END EXPORT STEP-------
    2015-01-29 02:33:51,214 INFO  [AIF]: [HPLService] Info: [consolidateData:621,Jan 15] END (true)
    2015-01-29 02:33:51,264 INFO  [AIF]: -------START CHECK STEP-------
    2015-01-29 02:33:51,316 INFO  [AIF]: -------END CHECK STEP-------
    2015-01-29 02:33:51,413 INFO  [AIF]: FDMEE Process End, Process ID: 621

  • Do I need to compress video before importing into DVD SP?

    Hello,
    this is probably a stupid question, but I would like to know if I need to compress my video before I import it into DVD SP as an asset. It looks like everyone uses Compressor before importing into DVD SP (setting the number of passes, bit rates and so on), and then they do it again when setting up preferences in DVD SP (setting 1 pass or 2 passes, bit-rate values).
    So isn't the video actually compressed twice? Once in Compressor and once in DVD SP?
    And also, what video formats are actually OK to import into DVD SP? (QT, AVI, FCP movie, etc.)
    Does DVD SP take them all, or does it have to be QT only?
    Thank you, and I apologize for these novice questions.

    Welcome to the Boards
    Madagascar wrote:
    So isn't the video actually compressed twice? Once in Compressor and once in DVD SP?
    And also, what video formats are actually OK to import into DVD SP? (QT, AVI, FCP movie, etc.)
    Does DVD SP take them all, or does it have to be QT only?
    Thank you, and I apologize for these novice questions.
    m2v will not be recompressed when placed on tracks (though in some instances it may be recompressed on menus). Ultimately it is better to compress outside of DVD SP for control of the encodes (and for making AC3 files), and also some formats may not be supported when imported into DVD SP directly.
    Some information to take a look at
    http://dvdstepbystep.com/faqs_3.php
    http://dvdstepbystep.com/faqs_7.php
    http://dvdstepbystep.com/qc.php
    http://dvdstepbystep.com/fasttrackover.php (middle section discusses Compressor a bit more)

  • Check data before loading through SQL*Loader

    Hi all,
    I have a temp table which is loaded through SQL*Loader. This table is used by a procedure for inserting data into another table.
    I frequently get error ORA-01722 (invalid number) during the procedure's execution.
    I have decided to check for the bad data through the control file itself.
    I have a few doubts about SQL*Loader:
    Will a record containing character data for a column declared as INTEGER EXTERNAL in the control file get discarded?
    Does declaring a column as INTEGER EXTERNAL take care of NULL values?
    Does the whole record get discarded if one of the column values is misplaced in the record in the input file?
    The control file is of the following format:
    LOAD DATA
    APPEND INTO TABLE Temp
    FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY "'"
    TRAILING NULLCOLS
    (FILEDATE DATE 'DD/MM/YYYY',
    ACC_NUM INTEGER EXTERNAL,
    REC_TYPE,
    LOGO,                           -- data: numeric; declared: VARCHAR
    CARD_NUM INTEGER EXTERNAL,
    ACTION_DATE DATE 'DD/MM/YYYY',
    EFFECTIVE_DATE DATE 'DD/MM/YYYY',
    ACTION_AMOUNT,                  -- data: numeric; declared: NUMBER
    ACTION_STORE,                   -- data: numeric; declared: VARCHAR
    ACTION_AUTH_NUM,
    ACTION_SKU_NUM,
    ACTION_CASE_NUM)
    What changes do I need to make in this file regarding the above questions?

    Is there any online document for this?
    Here it is
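
    Since the goal is to catch the rows that would later raise ORA-01722, one option is to pre-validate the file before SQL*Loader ever sees it. (For the record: a record whose INTEGER EXTERNAL field fails conversion is rejected to the bad file; an empty field with TRAILING NULLCOLS loads as NULL; and one bad column rejects the whole record.) A Python sketch, with field positions mirroring the control file above; the input file name is an assumption:

    import csv
    from datetime import datetime

    NUMERIC = {1: "ACC_NUM", 4: "CARD_NUM", 7: "ACTION_AMOUNT"}    # 0-based positions
    DATES = {0: "FILEDATE", 5: "ACTION_DATE", 6: "EFFECTIVE_DATE"}
    FIELD_COUNT = 12

    with open("temp_load.dat", newline="") as f:
        reader = csv.reader(f, delimiter="|", quotechar="'")
        for line_no, row in enumerate(reader, start=1):
            problems = []
            if len(row) != FIELD_COUNT:
                problems.append(f"expected {FIELD_COUNT} fields, got {len(row)}")
            for pos, name in NUMERIC.items():
                value = row[pos].strip() if pos < len(row) else ""
                if value:  # empty is fine: TRAILING NULLCOLS loads it as NULL
                    try:
                        float(value)
                    except ValueError:
                        problems.append(f"{name} is not numeric: {value!r}")
            for pos, name in DATES.items():
                value = row[pos].strip() if pos < len(row) else ""
                if value:
                    try:
                        datetime.strptime(value, "%d/%m/%Y")
                    except ValueError:
                        problems.append(f"{name} is not DD/MM/YYYY: {value!r}")
            if problems:
                print(f"line {line_no}: " + "; ".join(problems))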

  • Error when a read-only permission is set and data is filtered before loading with the Excel 2013 Add-in

    Good afternoon :)
    I have an MDS issue that is making me lose my mind.
    I have a permission set on an entity. It is a read-only permission on the entity, but I have tried putting it on every field and the same thing happens.
    Every time an entity has any kind of read-only permission assigned to it or its fields, the Excel Add-in shows an error when we try to load it. When the entity has more rows than the maximum set in the Settings pane, it offers an option to filter the data. When you try to use this filter, Excel shows an error message, but you can press OK and everything works fine.
    Here is the message:
    The thing is, my user does not want to see it :( and I don't know how to get rid of it.
    Does anyone have an idea how to fix it?
    In the debug log of the add-in, there is this message:
    2014-10-22T11:38:42.152        8440 EXCEL.EXE            EXCEL.EXE                               
    Generic          EventType: Error, Message: Unobserved exception in TaskScheduler. Exception:'System.AggregateException: One or more errors occurred. ---> System.NullReferenceException: Object reference not set
    to an instance of an object.
       at System.Windows.Forms.Control.MarshaledInvoke(Control caller, Delegate method, Object[] args, Boolean synchronous)
       at System.Windows.Forms.Control.Invoke(Delegate method, Object[] args)
       at System.Windows.Forms.WindowsFormsSynchronizationContext.Send(SendOrPostCallback d, Object state)
       at Microsoft.MasterDataServices.ExcelAddInCore.ExcelHelpers.ExecuteOnUIThread(SendOrPostCallback callback)
       at Microsoft.MasterDataServices.ExcelAddInCore.DataView.FinalizeUIOperation(Boolean mdsOperation)
       at Microsoft.MasterDataServices.ExcelAddInCore.DataView.<>c__DisplayClass53.<LoadData>b__51(IAsyncResult ar)
       at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)
       --- End of inner exception stack trace ---
    ---> (Inner Exception #0) System.NullReferenceException: Object reference not set to an instance of an object.
       at System.Windows.Forms.Control.MarshaledInvoke(Control caller, Delegate method, Object[] args, Boolean synchronous)
       at System.Windows.Forms.Control.Invoke(Delegate method, Object[] args)
       at System.Windows.Forms.WindowsFormsSynchronizationContext.Send(SendOrPostCallback d, Object state)
       at Microsoft.MasterDataServices.ExcelAddInCore.ExcelHelpers.ExecuteOnUIThread(SendOrPostCallback callback)
       at Microsoft.MasterDataServices.ExcelAddInCore.DataView.FinalizeUIOperation(Boolean mdsOperation)
       at Microsoft.MasterDataServices.ExcelAddInCore.DataView.<>c__DisplayClass53.<LoadData>b__51(IAsyncResult ar)
       at System.Threading.Tasks.TaskFactory`1.FromAsyncCoreLogic(IAsyncResult iar, Func`2 endFunction, Action`1 endAction, Task`1 promise, Boolean requiresSynchronization)<---

    Rafael,
    Is this still an issue?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager
