FDM: Issues with importing mapping using excel files

Hi All,
I am trying to map ICP entries using an Excel file.
I made two explicit mapping entries manually, exported them to Excel, and then added all the remaining mappings to the same file in the proper format. When I import the file I get no error, but the existing mappings are not updated and the new entries are not added, so there is effectively no change in the tDataMap table.
Environment details: FDM version 11.1.1.3, target app Essbase 11.1.1.3. I am able to connect to Essbase, so there are no connectivity issues.
Has anyone faced this issue?

The section of the admin guide I referenced will tell you how to create a properly formatted Excel workbook. Import XLS works if the workbook is set up properly; note that the Import XLS functionality will not change existing records, it only adds new ones.
Your template requires that the first cell is the table name, the second row holds the table field names, and rows 3 through X hold the values to be inserted into the table. The named range beginning with UPS must cover all of these rows and columns, for example:
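A minimal sketch of what the worksheet might look like for an ICP map (hedged: the exact tDataMap field names vary by release, so copy them from the export of your two manual entries rather than from this illustration; a named range such as upsICPMap must cover every populated row and column):

    Row 1:  tDataMap
    Row 2:  PartitionKey | DimName | SrcKey  | SrcDesc | TargKey
    Row 3:  745          | ICP     | IC_1001 |         | ICP_1001
    Row 4:  745          | ICP     | IC_1002 |         | ICP_1002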
Quite honestly, if you take the time to review the admin guide, I'm sure you'll find the answer that you need. Please understand that, like you, most of us that post on this board are consultants. We share knowledge to help the community better utilize the product. It's frustrating to hear that none of the answers that I previously provided you were at all helpful.
Edited by: TonyScalese on Nov 30, 2010 3:23 PM

Similar Messages

  • Import map using excel document.

    Hello Experts,
    I built an import map to run manually with Import Manager, but when I click Ready to Import, Import Manager processes only one record at a time. I have to go back to the Match tab and refresh in order to import the rest.
    Version 7.1
    Using an Excel document
    I can see all the records as "create", but when I click the icon to import records it only imports one at a time, and I have to come back to the Match tab -> refresh -> import again.
    Any idea why this might be happening?
    Thank you very much for your help,
    Claudia Hardeman

    Hi Claudia,
    Unfortunately this information is not sufficient. Can you tell us your MDM server version, which table type you are importing data into, the record-matching field types and logic (if multiple fields are used), and your version of MS Excel? This will help.
    Meanwhile, please have a look at the SAP note below, in case it applies to you:
    "Note 1575981 - Import manager only imports single records"
    Regards,
    Shiv

  • Issues with external table from excel file

    Dear all,
    I have been trying to use the statement below to create an external table; the table references an Excel file saved as CSV.
    CREATE TABLE EMPFRAUD_TEST
    ( SERIAL_NUM VARCHAR2(10 BYTE),
      BRANCH_CODE VARCHAR2(10 BYTE),
      BUSINESS_ADD VARCHAR2(100 BYTE),
      REGIONS VARCHAR2(50 BYTE),
      TRANSACTION_DATE_TIME DATE,
      REPORT_DATE_TIME DATE,
      NO_OF_TRANS VARCHAR2(4 BYTE),
      AMOUNT NUMBER,
      FRAUD_TYPE VARCHAR2(25 BYTE),
      IMPACT_CATEGORY VARCHAR2(10 BYTE)
    )
    ORGANIZATION EXTERNAL
    ( TYPE ORACLE_LOADER
      DEFAULT DIRECTORY EXT_EMP_TEST
      ACCESS PARAMETERS
      ( records delimited by newline
        badfile 'empfraud%a.bad'
        logfile 'empfraud%a.log'
        fields terminated by ','
        optionally enclosed by '"' lrtrim
        missing field values are null
      )
      LOCATION ('fraud.csv')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOMONITORING;
    The problem is as follows:
    1) When I run the statement above the table is created, but when I try to select from it, an empty result is returned.
    When I checked the error log file, the following messages were given (the database is on a Unix server):
    "L_NUM
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 71 rejected in file /home/oracle/ext_folder_test/fraud.csv
    KUP-04021: field formatting error for field ACCOUNT_KEY
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 79 rejected in file /home/oracle/ext_folder_test/fraud.csv
    KUP-04021: field formatting error for field SERIAL_NUM
    KUP-04036: second enclosing delimiter not found
    KUP-04101: record 80 rejected in file /home/oracle/ext_folder_test/fraud.csv
    error processing column TRANSACTION_DATE_TIME in row 1 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01858: a non-numeric character was found where a numeric was expected
    error processing column TRANSACTION_DATE_TIME in row 2 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 3 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 8 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 9 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 10 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 11 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01858: a non-numeric character was found where a numeric was expected
    error processing column TRANSACTION_DATE_TIME in row 12 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 13 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 14 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month
    error processing column TRANSACTION_DATE_TIME in row 15 for datafile /home/oracle/ext_folder_test/fraud.csv
    ORA-01843: not a valid month"
    Please, I need help to resolve this fast.
    Thanks and regards,
    Ajani Abdulrahman Olayide
    NB:
    After conversion to .csv format, below is the data I am trying to access from the Excel file:
    BUSINESS OFFICE,REGIONS,Transaction_Date_Time,Report_Date_Time,Account_Key (Account No),     Number_of_Transactions,Total_Amount (N)      
    1,162,9 ojo street,Lagos South,various ,17/01/10,16200388749987,1,5100000,CHEQUE
    2,0238,"10 cyril Road, Enugu",East,21/06/2006,23/12/10,020968765357 09867653920174,1,20000000
    3,0127,"261, obiageli Rd, Asaba",Mid-West,22/12/2010,23/12/10,'00160030006149,1,6000000
    4,0519,"just road, Onitsha
    ",East,12/03/2010,14/02/11,0896002416575,1,5000000
    5,0519,"just road, Onitsha
    ",East,03/12/2010,14/02/11,06437890134356,1,5000000
    6,149,olayide street,Lagos South,10/02/2010,17/02/11,NGN01492501036 ,1,6108950
    7,0066,wale,Mid - west,18/02/2011,18/02/10,'05590020002924,1,55157977.53
    8,66,john,Mid- west,11/03/2010,14/03/09,'00660680054177,1,6787500
    9,0273,waheed Biem,N/Central,Jan 09 to Dec 2010,01/04/11,Nil,1,14146040

    As others suggested, you have to do the debugging yourself. To avoid the date errors you may need something like the following (note the explicit date masks; rows whose date columns hold values like "various" or "Jan 09 to Dec 2010" will still be rejected):
    CREATE TABLE EMPFRAUD_TEST
    (   SERIAL_NUM   VARCHAR2(10),
        BRANCH_CODE  VARCHAR2(10),
        BUSINESS_ADD VARCHAR2(100),
        REGIONS      VARCHAR2(50),
        TRANSACTION_DATE_TIME DATE,
        REPORT_DATE_TIME DATE,
        NO_OF_TRANS     VARCHAR2(50),
        AMOUNT          NUMBER,
        FRAUD_TYPE      VARCHAR2(25),
        IMPACT_CATEGORY VARCHAR2(10)
    )
    ORGANIZATION EXTERNAL
    (   type oracle_loader default directory saubhik
        access parameters
        ( records delimited by newline
          badfile 'empfraud%a.bad'
          logfile 'empfraud%a.log'
          skip 1
          fields terminated by ','
          optionally enclosed by '"' ltrim
          missing field values are null
          ( serial_num,
            branch_code,
            business_add,
            regions,
            transaction_date_time date "dd/mm/rrrr",
            report_date_time date "dd/mm/rr",
            no_of_trans,
            amount,
            fraud_type,
            impact_category
          )
        )
        LOCATION ('fraud.csv')
    )
    REJECT LIMIT UNLIMITED;
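    A quick sanity check once the table exists (a hedged sketch; rejected rows are reported in the badfile and logfile named above, in the same directory):
        SELECT COUNT(*) FROM empfraud_test;
        SELECT * FROM empfraud_test WHERE ROWNUM <= 5;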

  • Issue with importing data using data pump

    Hi Guys,
    Need your expertise here. I have just exported a table using the following Data Pump command. Please note that I used %U to split the export into chunks of 2G files.
    expdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110.log tables=(PT_CONTROL.pipeline_session) filesize=2G job_name=pt_ps_0831_1
    The above command produced the following files
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:04 PT_CONTROL_PS_083110_01.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:05 PT_CONTROL_PS_083110_02.dmp
    -rw-r----- 1 oracle oinstall 2.0G Aug 31 15:06 PT_CONTROL_PS_083110_03.dmp
    -rw-r----- 1 oracle oinstall 394M Aug 31 15:06 PT_CONTROL_PS_083110_04.dmp
    -rw-r--r-- 1 oracle oinstall 2.1K Aug 31 15:06 PT_CONTROL_PS_083110.log
    So far things are good.
    Now when I import the data using the command below, it truncates the table but does not import any data. The last line says "Job "SYS"."PT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57".
    impdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=PT_ps_imp_0831_1
    Import: Release 10.2.0.3.0 - Production on Tuesday, 31 August, 2010 15:14:53
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.2.0.3.0 - Production
    Master table "SYS"."AT_PS_IMP_0831_1" successfully loaded/unloaded
    Starting "SYS"."AT_PS_IMP_0831_1": '/******** AS SYSDBA' dumpfile=DP_TABLES:PT_CONTROL_PS_083110_*%U*.dmp logfile=DP_TABLES:PT_CONTROL_PS_083110_IMP.log Tables=(PT_CONTROL.pipeline_session) TABLE_EXISTS_ACTION=Truncate job_name=AT_ps_imp_0831_1
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39153: Table "PT_CONTROL"."PIPELINE_SESSION" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/TRIGGER
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "SYS"."AT_PS_IMP_0831_1" completed with 1 error(s) at 15:14:57
    I suspect it has something to do with %U in the impdp command. Has anyone encountered this situation before? What should my import command be? I just want to confirm I am using the right one.
    Thanks
    --MM
    Edited by: UserMM on Aug 31, 2010 3:11 PM

    I also looked into the alert log but didn't find anything about the error there. Any opinion?
    --MM
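    One low-risk diagnostic (a hedged sketch, not a confirmed fix): re-run the same dump set in SQLFILE mode. SQLFILE writes the DDL Data Pump would execute into a script without importing any data, so it confirms whether all the %U pieces are readable:
        impdp "'/ as sysdba'" dumpfile=DP_TABLES:PT_CONTROL_PS_083110_%U.dmp tables=(PT_CONTROL.pipeline_session) sqlfile=DP_TABLES:pt_ps_check.sql job_name=pt_ps_chk_1
    If that completes cleanly, the dump files themselves are fine and the error is in the TABLE_EXISTS_ACTION=Truncate load step.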

  • Import of big Excel files

    Hi,
    Looking for help with importing a big Excel file (40,000 records) into a 10g database.
    Each time I try to import the big file, SQL Developer turns unresponsive and I have to use Task Manager to end the program.
    Thanks
    Dov

    How long did you wait? Unresponsive usually is just busy.
    Can you verify activity e.g. in the Enterprise Manager?
    K.
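    If it never finishes, an alternative route (a hedged sketch, outside SQL Developer; the file, table, and column names here are invented): save the sheet as CSV and load it with SQL*Loader, which handles 40,000 rows easily.
        -- big_load.ctl
        LOAD DATA
        INFILE 'big_file.csv'
        APPEND INTO TABLE big_table
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        (col1, col2, col3)
    Run it from the command line with: sqlldr scott/tiger control=big_load.ctl log=big_load.log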

  • Issues with importing from excel

    I have been running into several issues with importing from Excel.
    First my configuration
    I am running SQL Developer ver 1.5.5 Build MAIN-5969
    I am on a Windows XP Professional Version 2002 with Service Pack 3
    I am importing into an Oracle 10g database.
    1. SQL Developer doesn't work on Excel 2007, I have to save all my files into Excel 97-2003 format before I can use them.
    2. If I run into an error loading and stop the process, SQL Developer doesn't release the Excel file and I get a sharing violation if I try to save the spreadsheet without closing SQL Developer.
    3. I have found that I have to set print area to the area I want to work with, otherwise the import wizard tries to keep adding rows.
    4. When using the Import wizard, it keeps adding commas to numeric fields unless I specify the column in Excel as text. Currently the column is formatted as General in the spreadsheet, or I can tell the wizard the column is an integer, but it is actually just a code field with numeric characters, so it may have leading zeroes that I need to keep.
    This might be related:
    I have a column in Excel defined as text but containing only numerics. It is of length 4, but the wizard sets a precision of 5 with a datatype of VARCHAR2. If I try to change it to 4, I get an error saying the field is not large enough; yet when I do a LEN on the column, it only gives me 4. Other columns in the same sheet with 3-position and 2-position numerics are fine. I suspect this is related to the comma being inserted into numeric fields of more than 3 positions.
    5. Importing Excel dates to Oracle dates doesn't work. I have to convert the Excel date column to text, import it as a VARCHAR, then convert it to DATE once in the database (see the sketch after this list).
    6. The default of nullable is not set on any columns and I have to set them before the import. (I would prefer nullable to be the default so that I have to uncheck the box to make a column not nullable; I would rather import all of the data and then deal with the nulls after they have been pulled in.)
    7. When I select that header columns are included, it uses them as the column names. Is it possible to do the name-length check at that point? It has bitten me a few times: I try to import, forget to check the name length, and then get an error once the import is running.
    8. If one of the columns to import has all nulls, then the precision comes out to 0 and if it isn't changed an error occurs on import. Could this situation default to a precision of 1?
    9. It would be nice if there was a completion message displayed and a cancel option while running.
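    For points 4 and 5, a common workaround once the rows have landed in a staging table (a hedged sketch; STG, CODE_TXT, and DATE_TXT are hypothetical names, and the date mask must match your sheet):
        -- restore leading zeroes lost where the wizard treated a code as numeric
        UPDATE stg SET code_txt = LPAD(code_txt, 4, '0');
        -- convert dates imported as text into real DATE values
        ALTER TABLE stg ADD (load_date DATE);
        UPDATE stg SET load_date = TO_DATE(date_txt, 'MM/DD/YYYY');
        COMMIT;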

    On point 4:
    I have a column in excel that consists of numbers. 4 digit numeric codes. Ex, 1111, 2345, etc
    The column's format is general. It displays as just 4 numbers.
    When I start the wizard initially, the column appears with data as 1,111, 2,345, etc, on the Data Preview screen.
    It determines the precision to be 5 on the column definition screen.
    If I change the precision to 4 then continue, that field will error out when I verify with "not big enough to hold the data in source columns"
    If, I change the excel column to a TEXT column.
    Excel still displays as 1111, 2345, etc
    The wizard then displays the same data, 1111, 2345, on the Data Preview screen.
    Yet, when I get to the column definition screen it still sizes it as a 5
    If I change it to a 4, I get the same error when verifying.
    If I leave them alone as 5, then it processes just fine.

  • How can I create an Excel sheet with multiple tabs using UTL_FILE?

    How can I create an Excel sheet with multiple tabs using UTL_FILE?
    Can anyone help me?

    Jaggy,
    I gave you the most suitable answer on your own thread yesterday
    Re: How to Generating Excel workbook with multiple worksheets
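    For the archives, the usual workaround: UTL_FILE cannot write a real binary .xls workbook, but it can write SpreadsheetML (Excel 2003 XML), where each <Worksheet> element opens as a separate tab. A minimal hedged sketch, assuming a directory object MY_DIR already exists:
        DECLARE
          f UTL_FILE.FILE_TYPE;
        BEGIN
          f := UTL_FILE.FOPEN('MY_DIR', 'two_tabs.xml', 'w');
          UTL_FILE.PUT_LINE(f, '<?xml version="1.0"?>');
          UTL_FILE.PUT_LINE(f, '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"');
          UTL_FILE.PUT_LINE(f, '          xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">');
          UTL_FILE.PUT_LINE(f, '  <Worksheet ss:Name="Tab1"><Table>');
          UTL_FILE.PUT_LINE(f, '    <Row><Cell><Data ss:Type="String">Hello</Data></Cell></Row>');
          UTL_FILE.PUT_LINE(f, '  </Table></Worksheet>');
          UTL_FILE.PUT_LINE(f, '  <Worksheet ss:Name="Tab2"><Table>');
          UTL_FILE.PUT_LINE(f, '    <Row><Cell><Data ss:Type="Number">42</Data></Cell></Row>');
          UTL_FILE.PUT_LINE(f, '  </Table></Worksheet>');
          UTL_FILE.PUT_LINE(f, '</Workbook>');
          UTL_FILE.FCLOSE(f);
        END;
        /
    Opening two_tabs.xml in Excel shows Tab1 and Tab2 as separate sheets.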

  • Issues with import after updating LR: NEF images from my Nikon D3200 show in the Import dialog but are greyed out and can't be selected

    I've been using LR with my Nikon D3200 for a year or so, shooting in RAW/NEF with no import issues, until I updated LR; now LR won't let me select images when I try to import. It does show the images, but they're greyed out and can't be selected. Any thoughts? TIA

    Greyed-out images in the Import dialog box mean you have already imported those photos into Lightroom, so there is no need, and no benefit, to importing them a second time.
    Just go to the Library module, search for the desired photos, and resume working on these photos.

  • Issues with importing CR2 files into Lightroom 5: error message says "Some import operations were not performed"

    Urgent, please:
    Is anyone having issues with importing CR2 files into Lightroom 5? An error message comes up saying "Some import operations were not performed". Please advise on a solution.

    Sounds like the folder Write permissions issue described here with a solution:
    "Some import operations were not performed" from camera import

  • How do I import Map Info Tab files into Spatial for a map of europe?

    How do I import Map Info Tab files into Spatial for a map of europe via FME and have oracle spatial draw the map without problems?
    So far I've got to the stage where I can import the data, spatially index it (in oracle 9i) and get my SVG (scaleable vector graphics) application to view the map.
    The problem is that countries that have more than one polygon (more than one row in the database) do not draw properly.
    When I view the Map Info tab file in the FME viewer I can see that the data is fine, but I can also see that some of the polygons used to draw a country are donuts and some aren't.
    This seems to cause a problem when I import the data into Oracle Spatial, as I don't know whether a row in the table should be inserted as an independent SDO_GEOMETRY or should form part of a larger SDO_GEOMETRY (as in two or more rows making up the polygon shape for a country).
    I have a feeling that I'm not using FME correctly, because at the moment I have to import the tab file into Oracle and then re-insert the data into a spatially formatted table (one with a spatial index); I get the impression that FME should do that for me, but as I'm new to this I don't really know.
    Any Help welcome :|
    Tim

    Tim,
    MapInfo has a free utility called EasyLoader that allows you to upload a table directly to Oracle. EasyLoader creates the geometries and spatial index. You can download it free from http://www.mapinfo.com/products/download.cfm?ProductID=1044
    Andy Greis
    CompuTech Inc.
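    On the multi-row-per-country question specifically: once the data is loaded, the pieces belonging to one country can be merged into a single multipolygon with Oracle's spatial aggregate. A hedged sketch (EUROPE_POLYS, COUNTRY, and GEOM are invented names; 0.005 is a typical tolerance for geodetic data):
        SELECT country,
               SDO_AGGR_UNION(SDOAGGRTYPE(geom, 0.005)) AS country_geom
        FROM   europe_polys
        GROUP  BY country;
    Donut polygons should survive the union as interior rings, so each country comes out as one SDO_GEOMETRY that draws correctly.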

  • How to create the Export Data and Import Data using flat file interface

    Hi,
    Please let me know, based on the requirement below, how to export and import data using a flat file interface.
    Please provide the steps involved.
    BW/BI - Recovery Process for SNP data. 
    For each SNP InfoProvider,
    create:
    1) Export Data:
    1.a)  Create an export data source, InfoPackage, comm structure, etc. necessary to create an ASCII fixed length flat file on the XI
    ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider. 
    1.b)  All fields in each InfoProvider should be exported and included in the flat file. 
    1.c)  A process chain should be created for each InfoProvider with a start event. 
    1.d)  If the file exists on the target drive it should be overwritten. 
    1.e)  The exported data file name should include the InfoProvider technical name.
    1.f)  Include APO Planning Version, Date of Planning Run, APO Location, Calendar Year/Month, Material and BW Plant as selection criteria.
    2) Import Data:
    2.a) Create a flat file source system InfoPackage, comm structure, etc. necessary to import ASCII fixed length flat files from the XI
    ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
    2.b)  All fields for each InfoProvider should be mapped and imported from the flat file.
    2.c)  A process chain should be created for each InfoProvider with a start event. 
    2.d)  The file should be archived in the
    ctnhsappdata\iface\SCPI063\Archive directory.  Each file name should have the date appended in YYYYMMDD format.  Each file should be deleted from the \Out directory after it is archived. 
    Thanks in advance.
    Tyson

    Here's some info on working with plists:
    http://developer.apple.com/documentation/Cocoa/Conceptual/PropertyLists/Introduction/chapter1_section1.html
    They can be edited with any text editor. Xcode provides a graphical editor for them - make sure to use the .plist extension so Xcode will recognize it.

  • ExcelXML mapping: problem with XML maps in Excel sheet

    Hi Friends,
    I have one issue with ExcelXML mapping in Xcelsius.
    The problem is that I designed a dashboard using Excel XML mapping and everything works fine, but I have found, many times, that I cannot locate the mappings that were embedded in the Excel file. What I do now is re-map from scratch each time, which is becoming a big problem. How can I recover my XML maps in the Excel sheet? Can anyone please provide a solution?

    Shouldn't it be equivalent? I mean, as far as I know the ns0: shouldn't be a problem.
    When you have a namespace in the message, you need to associate it with some prefix; since ns0 (or any other prefix) is not present, you are getting the error. Having the namespace but not the ns0 prefix is the problem.
    The XMLAnonymizer bean may help you to add the namespace prefix...
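    To make the prefix point concrete (a hedged sketch; the message type and namespace URI are invented): both fragments below declare the same namespace, but only the second binds it to the ns0 prefix the mapping runtime expects:
        <!-- default (unprefixed) namespace declaration -->
        <MT_Order xmlns="urn:example:orders">...</MT_Order>
        <!-- same namespace bound to the ns0 prefix -->
        <ns0:MT_Order xmlns:ns0="urn:example:orders">...</ns0:MT_Order>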

  • Import data from excel file

    Hello.
    Can anybody help with how to import data from an Excel file into a form created with Designer 7.0? Originally there is a script inside the form that populates a drop-down list, and depending on the value selected in the ID-number drop-down, the description and price text fields are filled out. But now I have to feed this form with data from an Excel file that has more than 30,000 lines, and putting all of that data into the script is too much.
    So, how can I, after the ID number field is filled in, populate the description and price text fields with the data from the Excel file corresponding to that ID number?
    This form is used in Adobe reader.
    Any comments are welcome.
    Regards,
    Aivar

    Hi
    That's what I said in my previous post: clear the cache... :)
    Disable your cache from NQSConfig.ini. In the cache section of the NQSConfig file you will find:
        ENABLE = YES;
    Set it to NO.
    OR, if you are using a data warehouse as the source for OBIEE, you know when the ETL is done, so just create an iBot to purge the cache automatically at those particular intervals, so that the report runs fresh at that time.
    And what happened to your View Selector question?
    Edited by: Kishore Guggilla on Jul 3, 2009 3:52 PM

  • RC 8.6 Issues with Nikon D810 NEF RAW files

    RC 8.6 does not correctly process Nikon D810 NEF RAW files. See the following blog article for some details:
    Nikon D810 vs D800E ISO Comparison

    This is very helpful. My current workflow, until ACR 8.6 and LR fully support the Nikon D810, is to use the DNG Converter and import the resulting D810 DNG files into my LR catalog.
    I have not yet tried this (traveling right now), but I assume I can still make a custom profile using the X-Rite ColorChecker Passport and its associated software for my D810 using the RC 8.6 DNG files. This is my normal workflow for all of my other cameras in LR, but the process is direct from native RAW and not DNG (to produce the custom profile).
    On Jul 28, 2014, at 6:51 PM, "max.wendt" <[email protected]> wrote:
    RC 8.6 Issues with Nikon D810 NEF RAW files
    created by max.wendt in Adobe Camera Raw
    Until Lightroom gets D810 support, you will need to use the ACR plugin in order to see the profiles. The DNG Converter always uses Adobe Standard, and embeds it into the DNG. Lightroom 5.5 doesn't have the D810 profiles.

  • A strange issue with importing avi video

    I have a rather strange issue importing avi files (which play back normally in other programs) into CS4. When I import an avi recorded from my camera, the sound plays back normally in the preview, but the video itself is weird: roughly the first 5% of it is stretched over the whole span and slowed down to fit. So a 4-minute video plays sound normally for 4 minutes but replays only 10 seconds of video VERY slowly. I'm a bit puzzled, because I've used the same camera with CS3 and never encountered this problem. Breaking apart and speeding up the video didn't work, since the video only displays the first 5 recorded seconds no matter the speed; tweaking the presets didn't work either.

    subtlemolotov wrote:
    I obviously know it's a photo camera, I own it.
    The point is it's something we needed to know to answer your question.
    subtlemolotov wrote:
    Well I've found a very unprofessional solution.
    "An unprofessional solution?" Kind of like shooting video with an still camera.....
