Cannot extract dimension .csv files from HFM to be able to use maploader

Hi All!
We are in the middle of implementing FDM on a financial consolidation project. During this build we would like to use FDM to load data into HFM.
To help the users we would like to offer them the Maploader.xml that is provided in the mapping templates.zip.
To prepare this maploader file we need to extract the dimension data from HFM into .csv files and load these into the maploader.
Can anyone help me with extracting the .csv files? The script we use creates the .csv files but does not populate them with the metadata.
Is it possible to get the right script? Or is this already available in FDM (I can't find it)?
Thank you in advance for your support!!

Anyone got any tips or directions to solve this?

Similar Messages

  • Copy CSV file from windows to linux

    Our Oracle is installed on a UNIX machine, so we have to place the CSV file there.
    We are on Windows and use the PuTTY client to access the UNIX server.
    How do we copy a CSV file from Windows to the UNIX server?
    Thanks

    vis1985 wrote:
    Our Oracle is installed on a UNIX machine, so we have to place the CSV file there.
    We are on Windows and use the PuTTY client to access the UNIX server.
    How do we copy a CSV file from Windows to the UNIX server?
    Use scp (secure copy).
    It is supported by PuTTY, and ssh services are available by default on most (if not all) modern Unix servers.
    I don't have a Windows machine close by, but I recall the scp command is provided by the PuTTY executable called pscp.exe.
    The syntax is very simple:
    scp <from_destination> <to_destination>
    example (pushing a file from Windows to Oracle Unix server)
      scp c:\datafiles\accounts.csv [email protected]:/u01/files/accounts.csv
    This will prompt for a password.
    You can also use ssh trusted certificates (RSA or DSA keys) to automate this. Using PuTTY you can generate a key pair that consists of a private key and a public key.
    You copy and paste the public key into the destination server's $HOME/.ssh/authorized_keys file. This results in the server trusting that client and allows the client to connect (using ssh, scp and sftp) without having to supply a password, as the private part of the key serves as authentication. (I would, however, not use the oracle account for this, but a dedicated scp-copy Unix account that does not allow shell usage.)
    If you want to automate it the other way around - have the Unix server pull the file from the Windows client, then you need to install server software on that Windows machine to allow it to provide such a file copy service.
    Again I suggest using ssh - OpenSSH is available for a variety of platforms, and that should include Windows. If not, there are ssh services for Windows from Microsoft and other vendors that can be used.
    Any method other than ssh will be either insecure or problematic. FTP is insecure because usernames and passwords are transmitted in clear text. Using Windows File Sharing requires Samba to be loaded and configured on the Unix server - extra work installing, configuring and testing software that uses low-level security, is proprietary, and may also require settings on the Windows client to make it work in a somewhat robust fashion.
    The standard today is ssh. It is robust and secure, it makes a lot of sense to use it, and there is little good reason to use anything else.

  • Accessing CSV File from URL

    Hi Experts,
    I am developing an interface that needs to access a CSV file from a URL.
    The URL is: http://200.218.208.119/download/fechamento/20100413.csv
    The file is replaced every day, so the next day this file will be: 20100414.csv
    My interface generates the following errors:
      <Trace level="1" type="T">---- Plain HTTP Adapter Outbound----</Trace>
      <Trace level="1" type="T">---------------------------------------------</Trace>
    - <Trace level="1" type="B" name="CL_HTTP_PLAIN_OUTBOUND-ENTER_PLSRV">
      <Trace level="3" type="T">Quality of Service BE</Trace>
      <Trace level="1" type="T">Get XML-Dokument from the Message-Objekt</Trace>
      <Trace level="3" type="T">URL http://200.218.208.119:80/download/fechamento/20100413.csv</Trace>
      <Trace level="3" type="T">Proxy Host:</Trace>
      <Trace level="3" type="T">Proxy Service:</Trace>
      <Trace level="3" type="T">~request_method POST</Trace>
      <Trace level="3" type="T">~server_protocol HTTP/1.0</Trace>
      <Trace level="3" type="T">accept: */*</Trace>
      <Trace level="3" type="T">msgguid: A7031081480011DFA526001517D1434C</Trace>
      <Trace level="3" type="T">service: D0B_100</Trace>
      <Trace level="3" type="T">interface namespace: http://bcb.gov.br/xi/OB02</Trace>
      <Trace level="3" type="T">interface name: MI_RFC_OUT</Trace>
      <Trace level="3" type="T">Header-Fields</Trace>
      <Trace level="3" type="T">Prolog conversion Codepage: UTF-8</Trace>
      <Trace level="3" type="T">Epilog conversion Codepage: UTF-8</Trace>
      <Trace level="3" type="T">content-length 133</Trace>
      <Trace level="3" type="T">content-type: text/csv; charset=UTF-8</Trace>
      <Trace level="2" type="T">HTTP-Response :</Trace>
      <Trace level="1" type="T">Method Not Allowed</Trace>
      <Trace level="2" type="T">Code : 405</Trace>
      <Trace level="2" type="T">Reason: Method Not Allowed</Trace>
      <Trace level="2" type="T">HTTP-response content-length 6261</Trace>
    Can anyone help?

    > I am developing an interface that needs to access a CSV file from a URL.
    This is not supported in standard PI.
    I think the option will be available in PI 7.3, but I cannot promise it.

  • Inserting Record into CSV file from BizTalk Orchestration

    Scenario:
    1. Receive a file from the source system via a receive pipeline.
    2. In the Orchestration, extract some values like ENO, Ename, Salary etc. These values are to be added to a CSV file from an Expression Shape. How do we append employee records to the CSV without overwriting the existing rows?
    Example: if we submit 10 files, then the CSV file should contain 10 rows.
    Let me know how to create a CSV file from an Orchestration and how to add rows to that CSV file.
    Regards, BizTalkWorship

    Simple.
    Receive the message through a Receive Port/Location.
    Create a flat-file schema representing the CSV file structure. Ensure each row is delimited by “{CR}{LF}”. 
    This flat-file schema should only contain the elements which you want to see in the destination CSV file, like ENO, Ename, Salary etc.
    Have a map where the source schema should be the one which represents the received file and destination schema should be the one which is above created flat-file schema.
    Map the source schema to the destination schema, mapping the fields
    ENO, Ename, Salary etc.
    Have a custom send pipeline with a flat-file assembler component in it. Use this send pipeline in the send port.
    In the send port, configure a send filter like “BTS.ReceivePortName == YourReceivePortName”. Configure the send port’s “Outbound Maps” to the map which you created in the
    above step.
    Key point: in your send port, set the “Copy Mode” property to “Append” instead of the default “Create New”.
    With your send port’s “Copy Mode” property configured to “Append”, the output is appended to the existing file. Since each record in your flat-file schema
    is delimited by “{CR}{LF}” and you are appending to the output file, you will have one file with records appended. So if 10 files are received, instead of 10 output files you will have 1 CSV file with 10 rows.
    If you want to construct the message in the Orchestration instead of applying the map on the send port’s outbound maps, you can still do that.

  • Delete a .csv file from desktop system

    Hi All,
    My requirement is to read a .csv file from a desktop system's shared folder and delete the file after it has been read successfully.
    I can read the .csv file from that location using the function RFC_REMOTE_FILE and load the content into an internal table.
    But I can't delete the file from the presentation server (desktop system).
    Can anyone tell me how to delete the .csv file from the desktop system at a different location?
    Note:
    I followed this link to read file:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/9831750a-0801-0010-1d9e-f8c64efb2bd2&overridelayout=true

    Hi Rob,
    Thanks. I solved this problem myself.
    The solution to delete the file from the remote system is:
    concatenate 'DEL' i_filename i_dirname into v_bkfile separated by space.
    call function 'RFC_REMOTE_EXEC'
      destination  c_dest
      exporting
        command               = v_bkfile
      exceptions
        system_failure        = 1  MESSAGE v_ermsg
        communication_failure = 2  MESSAGE v_ermsg.

  • Error loading csv file from application server

    Hi all,
    While uploading a CSV file from the application server to the PSA we are getting the following error:
    Error 2 while splitting CSV data record
    Message no. RSDS_ACCESS011
    Diagnosis
    Error 2 occurred while splitting the CSV data record 1
    1 = Could not find a closing escape character
    2 = Invalid escape character
    3 = Conversion error
    4 = Other error
    System Response
    The function was terminated.
    Procedure
    Check the values of the data separator and escape sign, and try again.
    But I've checked the file, and the escape sign and data separator in it are also fine. We are able to load the same file successfully in the quality system.
    How to solve this error??
    Thanks in advance.

    Hi BI consultant:
       Could you please provide more details?
    For example:
    1. Is your production (P) application server a UNIX flavor (Solaris, AIX, HP-UX, Linux)?
       If yes..
             2. Are you able to see the contents of the file correctly with a "cat" or "vi" command? (at operating system level).
                   If no...
                         3. Did you upload the csv flat file to the server via FTP?
                                If yes...
                                     4. Did you use the "binary" or the "ascii" parameter on the FTP command used to upload the file?
    You probably need to upload the CSV file again to your application server and make sure you can see the file contents ("cat" or "vi" command) before trying to execute the InfoPackage.
    Regards,
    Francisco Milán.

  • Download CSV-File from Interactive Report

    Dear all,
    I want to download a CSV file from an Interactive Report.
    The content of the rows in Excel:
    Erste,"B","C","D","E","F","G","H","I","J","K","L","M","N","O","P"
    IBC ->,"-","1","2","3","4","5","6","7","8","9","10","11","12","13","14" [...]
    How can I solve this problem?
    Thanks for help.

    Your CSV file uses a comma as the separator, but Excel assumes the CSV separator is a semicolon.
    Try setting "CSV Separator" = ";" in the "Report Attributes" - "Download" section.

  • Extracting a flat file from oracle table

    I have moved the knowledge module IKM SQL to File Append from the metadata into my project folder.
    But when I create an interface mapping the Oracle table to a flat file on a different Unix server, the drop-down menu shows only IKM SQL to SQL and IKM SQL Control Append. It does not show the SQL to File Append knowledge module option.
    What should I do to extract a flat file from an Oracle table?
    Thanks
    Hima
    Overstock.com

    All IKMs in the drop-down menu depend on the target technology.
    One question: in this interface, is your target table a file?

  • Reading a CSV file from server

    Hi All,
    I am reading a CSV file from the server, and my internal table has only one field, with length 200. The input CSV file has more than one column, and when splitting a record my internal table should end up with the same number of rows as the input record has columns.
    But when I do that, the last field in the internal table has a # appended to it.
    Can somebody tell me the solution for this?
    You can see my code below.
    parameters: p_ipath type localfile.   " path of the CSV file on the application server
    data: begin of itab_infile occurs 0,
             input(3000),
          end of itab_infile.
    data: begin of itab_rec occurs 0,
             record(200),
          end of itab_rec.
    data: c_comma(1) value ','.
    open dataset p_ipath for input in text mode encoding default.
    if sy-subrc <> 0.
      write: /, 'FILE NOT FOUND'.
      exit.
    endif.
    " read the file record by record and split each line at the comma
    do.
      read dataset p_ipath into itab_infile-input.
      if sy-subrc <> 0.
        exit.
      endif.
      split itab_infile-input at c_comma into table itab_rec.
    enddo.
    close dataset p_ipath.
    Thanks in advance.
    Sunil

    Sunil,
    You do not mention the platform on which the CSV file was created or the platform on which it is read.
    A common problem with CSV files created on MS/Windows platforms and read on Unix is the end-of-record (EOR) characters.
    MS/Windows uses <CR><LF> as the EOR.
    Unix uses <LF> only.
    If on Unix, open the file with vi in a telnet session to confirm the EOR type.
    The fix options:
    1) Before opening the file in your ABAP program, run the Unix command dos2unix on it.
    2) Transfer the file from the MS/Windows platform to Unix via FTP in ASCII mode, not binary.  This does the dos2unix conversion on the fly.
    3) Install Samba and share the load directory with the Windows platforms.  Samba also handles the dos2unix and unix2dos conversions on the fly.
    Hope this helps
    David Cooper

  • Stored Proc to create .csv file from table

    Hi all,
    I have a table with 4 columns and 10 rows. Now I need a stored proc to create a .csv file from the table data. Here are the scripts to create the table and insert the rows.
    Create table emp(emp_id number(10), emp_name varchar2(20), department varchar2(20), salary number(10));
    Insert into emp values ('1','AAAAA','SALES','10000');
    Insert into emp values ('2','BBBBB','MARKETING','8000');
    Insert into emp values ('3','CCCCC','SALES','12000');
    Insert into emp values ('4','DDDDD','FINANCE','10000');
    Insert into emp values ('5','EEEEE','SALES','11000');
    Insert into emp values ('6','FFFFF','MANAGER','90000');
    Insert into emp values ('7','GGGGG','SALES','12000');
    Insert into emp values ('8','HHHHH','FINANCE','14000');
    Insert into emp values ('9','IIIII','SALES','20000');
    Insert into emp values ('10','JJJJJ','FINANCE','21000');
    commit;
    Now I need a stored proc to create a .csv file in my local location. Please let me know if you need any other details.

    Some pointers:
    http://www.oracle-base.com/articles/9i/GeneratingCSVFiles.php
    http://tkyte.blogspot.com/2009/10/httpasktomoraclecomtkyteflat.html
    also, doing a search on this forum or http://asktom.oracle.com will give you many clues.
    Regarding ".csv file in my local location": what is your 'local location'?
    A client machine? The database server machine?
    What database version are you using?
    (the result of: select * from v$version; )
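    If that 'local location' turns out to be a directory on the database server, a minimal PL/SQL sketch along the lines of the oracle-base article could use UTL_FILE. Note that the directory object CSV_DIR and the procedure name emp_to_csv below are illustrative assumptions, not a definitive implementation:
      -- Assumes an existing directory object the user can write to, e.g.:
      --   CREATE OR REPLACE DIRECTORY csv_dir AS '/u01/app/oracle/csv';
      --   GRANT READ, WRITE ON DIRECTORY csv_dir TO your_user;
      CREATE OR REPLACE PROCEDURE emp_to_csv (p_filename IN VARCHAR2 DEFAULT 'emp.csv')
      AS
        l_file UTL_FILE.FILE_TYPE;
      BEGIN
        l_file := UTL_FILE.FOPEN('CSV_DIR', p_filename, 'w', 32767);
        -- header row
        UTL_FILE.PUT_LINE(l_file, 'EMP_ID,EMP_NAME,DEPARTMENT,SALARY');
        -- one CSV line per row of the emp table from the question
        FOR r IN (SELECT emp_id, emp_name, department, salary FROM emp ORDER BY emp_id) LOOP
          UTL_FILE.PUT_LINE(l_file,
            r.emp_id || ',' || r.emp_name || ',' || r.department || ',' || r.salary);
        END LOOP;
        UTL_FILE.FCLOSE(l_file);
      EXCEPTION
        WHEN OTHERS THEN
          IF UTL_FILE.IS_OPEN(l_file) THEN
            UTL_FILE.FCLOSE(l_file);
          END IF;
          RAISE;
      END emp_to_csv;
      /
    Writing to a client PC instead would need a different approach (for example spooling from SQL*Plus on the client), which is why the 'local location' question matters.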

  • Why I cannot open camera raw files from camera Nikon 3200 through Adobe Bridge and/or Photoshop?

    Why can I not open raw files (NEF) from a Nikon D3200 camera through Adobe Bridge and/or Photoshop?

    Thanks for the tip. It worked out perfectly.

  • SSIS 2008 – Read roughly 50 CSV files from a folder, create SQL table from them dynamically, and dump data.

    Hello everyone,
    I’ve been assigned a requirement wherein I need to read around 50 CSV files from a specified folder.
    In step 1 I would like to create the schema for these files, meaning take the CSV files one by one and create a SQL table for each, if it does not already exist at the destination.
    In step 2 I would like to append the data of these 50 CSV files into the respective tables.
    In step 3 I would like to purge data older than a given date.
    Please note that the data in these CSV files will be very bulky; I would like to know the best way to insert bulk data into a SQL table.
    Also, some of the CSV files will have 4 rows at the top of the file which contain the header details/header rows.
    As far as I know I will be asked to implement this on SSIS 2008, but I’m not 100% sure of it.
    So, please feel free to provide multiple approaches if we can achieve these requirements elegantly in newer versions like SSIS 2012.
    Any help would be much appreciated.
    Thanks,
    Ankit
    Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com

    Hello Harry and Aamir,
    Thank you for the responses.
    @Aamir, thank you for sharing the link. Yes, I'm going to use a Script Task to read the header columns of the CSV files and prepare one SSIS variable holding the SQL script that creates the required table, with an "if not exists" check, inside the Script Task itself.
    An "Execute SQL Task" will follow the Script Task, and this will create the actual table for a CSV.
    Both these components will be inside a Foreach Loop container and will process all 50 CSV files one by one.
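    For illustration, the statement built into that SSIS variable could look roughly like the sketch below; the table name and columns are placeholders, since the real script would be generated from each CSV header:
      -- Hypothetical generated DDL for one CSV file (names and types are placeholders)
      IF NOT EXISTS (SELECT 1 FROM sys.tables t
                     WHERE t.name = 'SalesData'
                       AND t.schema_id = SCHEMA_ID('dbo'))
      BEGIN
          CREATE TABLE dbo.SalesData
          (
              OrderId   INT,
              OrderDate DATETIME,
              Amount    DECIMAL(18, 2)
          );
      END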
    Some points to be clarified,
    1. Among these 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50 we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50, the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we manage this within the same loop? Do we need another Foreach Loop inside the parent one? What is the best way to achieve this requirement?
    3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an "older than" criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    Please note, I'm very new to the SSIS world and would like to develop these packages for the client using best package-development practices.
    Any help would be greatly appreciated.
    Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com
    1. Among these 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50 we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    How can you identify these files? Is it based on the file name, or is there some information in the file which indicates that it requires a purge? If so, you can pick this information up during the file-name or file-data parsing step and set a Boolean variable. Then, in the control flow, have a conditional precedence constraint which checks the Boolean variable and, if it is set, executes an Execute SQL Task to do the purge (you can use TRUNCATE TABLE or DELETE FROM TableName statements).
    2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50, the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we manage this within the same loop? Do we need another Foreach Loop inside the parent one? What is the best way to achieve this requirement?
    The best way to achieve this is to append a sequential value to the file name (for example a timestamp) and then process the files in sequence. This can be done prior to the main loop, so that you can use the same loop to process these duplicate file names as well. The best option would be to use the file-creation-date attribute value so that the files get processed in the right sequence. You can use a Script Task to get this for each file, as described below:
    http://microsoft-ssis.blogspot.com/2011/03/get-file-properties-with-ssis.html
    3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an "older than" criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    You can use a SQL script for this. Just call a SQL procedure with a single parameter called @CutOffDate and write logic like the below:
    CREATE PROC PurgeTableData
    @CutOffDate datetime
    AS
    DELETE FROM Table1 WHERE DateField < @CutOffDate;
    DELETE FROM Table2 WHERE DateField < @CutOffDate;
    DELETE FROM Table3 WHERE DateField < @CutOffDate;
    GO
    @CutOffDate denotes the date before which data has to be purged.
    You can then schedule this SP in a SQL Agent job so that it executes at your required frequency.
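    As a small usage sketch, the Agent job step (or a manual run) would then just execute the procedure with the chosen cut-off date, for example:
      -- Example only: purge data older than 1st Jan 2015
      EXEC PurgeTableData @CutOffDate = '20150101';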
    Visakh

  • Cannot open nikon raw files from my 3200 in elements 8

    I cannot open Nikon raw files from my D3200 in Elements 8.

    http://helpx.adobe.com/creative-suite/kb/camera-raw-plug-supported-cameras.html
    You would need PSE11, since the D3200 requires ACR version 7.1.
    Only PSE11 can have ACR 7.1, so to be able to use your raw files with PSE8 you should download the free Adobe DNG Converter, which can batch-convert a whole folder of raw files into the Adobe DNG raw format that your PSE8 can open.
    http://helpx.adobe.com/x-productkb/multi/troubleshoot-camera-raw-photoshop-photoshop.html
    http://www.adobe.com/support/downloads/detail.jsp?ftpID=5486
    http://www.adobe.com/support/downloads/detail.jsp?ftpID=5518

  • How to create .csv file from ABAP report

    Hi
    We have a requirement to generate a .csv file from an ABAP report.
    Currently the user saves the data from the ABAP report to a spreadsheet (.xls format) on the desktop, then opens the Excel file and saves it in .csv format. We need an option to save directly in .csv format instead of .xls format.
    Please let me know if there is any standard function module available to create a .csv file.
    Regards
    Uma

    I tried with your code; it dumps.
    REPORT ztemp101 MESSAGE-ID 00.
    TABLES: lfa1.
    TYPES: BEGIN OF t_lfa1,
             lifnr LIKE lfa1-lifnr,
             name1 LIKE lfa1-name1,
           END OF t_lfa1.
    DATA: i_lfa1  TYPE STANDARD TABLE OF t_lfa1,
          wa_lfa1 TYPE t_lfa1.
    TYPES: truxs_t_text_data(4096) TYPE c.
    DATA: csv_converted_table TYPE TABLE OF truxs_t_text_data.
    SELECT-OPTIONS: s_lifnr FOR lfa1-lifnr.

    SELECT lifnr name1 FROM lfa1 INTO TABLE i_lfa1
           WHERE lifnr IN s_lifnr.

    " Convert the internal table into semicolon-separated CSV lines
    CALL FUNCTION 'SAP_CONVERT_TO_CSV_FORMAT'
      EXPORTING
        i_field_seperator    = ';'
      TABLES
        i_tab_sap_data       = i_lfa1
      CHANGING
        i_tab_converted_data = csv_converted_table
      EXCEPTIONS
        conversion_failed    = 1
        OTHERS               = 2.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.

    " Download the converted lines to the frontend as a .csv file
    CALL FUNCTION 'WS_DOWNLOAD'
      EXPORTING
        filename                = 'C:\Documents and Settings\ps12\Desktop\Test folder\exl.csv'
        filetype                = 'ASC'
      TABLES
        data_tab                = csv_converted_table
      EXCEPTIONS
        file_open_error         = 1
        file_write_error        = 2
        invalid_filesize        = 3
        invalid_type            = 4
        no_batch                = 5
        unknown_error           = 6
        invalid_table_width     = 7
        gui_refuse_filetransfer = 8
        customer_error          = 9
        OTHERS                  = 10.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    my version is 4.6c

  • Cannot import swf/air file from flash catalyst

    Cannot import a swf/air file from Flash Catalyst; it states there is an error in the code, and there is nothing in the properties panel.

    Hi Kristen
    Have you tried using ActionScript 2? You might give that a go and see if things improve.
    Just a thought... Rick
