HFM data extract for Multiload

I'm very new to this and need some help, please. When extracting HFM data for FDM Multiload, what is the best way to get the period data into columns rather than rows, without reformatting the data manually?

The data extract interface is limited and does not let you put the periods into multiple columns. What you can try is to create a Smart View retrieval (if the number of rows is not too large), save the file as a CSV, and then replace all commas with semicolons in a text editor. Otherwise, if you can write Excel VBA code, that is also an alternative, although the number of lines could again be an issue.
If you don't have to run it through translation tables, you can probably accomplish most of the changes using a powerful text editor such as KEDIT. A scripted pivot is sketched below.
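If scripting is an option, here is a minimal sketch of the pivot in Python rather than VBA. It assumes an extract laid out with one value per row and columns named Scenario, Year, Entity, Account, Period and Amount, and it writes a semicolon-delimited file; all of these names are assumptions, so adapt them to your actual extract and Multiload layout.

    import csv
    from collections import defaultdict

    # Assumed column names in the row-per-period HFM extract.
    KEY_FIELDS = ["Scenario", "Year", "Entity", "Account"]
    PERIODS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
               "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

    pivoted = defaultdict(dict)  # key tuple -> {period: amount}

    with open("hfm_extract.csv", newline="") as src:
        for rec in csv.DictReader(src):
            key = tuple(rec[k] for k in KEY_FIELDS)
            pivoted[key][rec["Period"]] = rec["Amount"]

    # One row per key, twelve period columns, semicolon-delimited output.
    with open("multiload.csv", "w", newline="") as out:
        writer = csv.writer(out, delimiter=";")
        writer.writerow(KEY_FIELDS + PERIODS)
        for key, amounts in pivoted.items():
            writer.writerow(list(key) + [amounts.get(p, "") for p in PERIODS])

Using the csv module also copes with quoted values that contain commas, which a plain find-and-replace of commas with semicolons would not.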

Similar Messages

  • HFM Data Extract in Task Flow ??

    Hi,
    I have a couple of clarifications on HFM data extracts in an HFM task flow:
    1) Is data extraction possible for predefined accounts in an HFM task flow? That is, can the Accounts and other dimension members be predefined so that simply executing the task flow extracts the data into a text file (it should not prompt us to select the Accounts and other dimension members each time we extract)?
    2) For data extract purposes, we need some of the HFM members under a different name, so we planned to create an alias table. During data extraction, is it possible to select the member names from a custom alias table?
    3) We have to flag a specific set of Account values as negative in the extracted data file, even though they are positive in HFM. Also, in the extracted data file, the Account member name must stay the same as the existing HFM Account name.
    For instance,
    In HFM:
    Account member name in HFM: VUK
    Account value: 1000
    In the extracted data file:
    Account member name in data file: VUK
    Account value: -1000
    Is this feasible? If yes, can you explain in a bit of detail how we can implement this in HFM?
    Thanks,
    Siva

    Hi,
    I need help with an HFM data extraction.
    We use four custom dimensions, and our requirement is to extract data only for the custom members we need (we need only the parent members in the custom dimensions).
    By default, however, HFM extracts data for all base custom members (and not for any parent custom members), which we do not want.
    Is there any possibility of extracting the data by choosing the custom members for which we need data?
    Ours is a classic HFM application on 9.3.1 (on an Oracle DB).
    Your response will be highly appreciated!
    Thanks,
    Siva

  • Error in data extraction for 0TCT_DS22

    Hi All,
    I am facing an issue with data extraction for 0TCT_DS22; it gives the message 'error in source system' and also produces a dump:
    Runtime Errors TSV_TNEW_PAGE_ALLOC_FAILED
    Date and Time 24.05.2010 11:22:53
    Short text
    No more storage space available for extending an internal table.
    What happened?
    You attempted to extend an internal table, but the required space was
    not available.
    As per the following thread I updated the table, but the data load is still failing (our system is on EHP1):
    DataSource 0TCT_DS22 not extracting Data
    Any pointers on what could be the solution for this data load failure?
    Thanks.

    Hi Neetika,
    This type of error occurs due to a lack of memory on the server, or because no work process is available to handle the request.
    Please refer to the forum thread below:
    Re: Runtime Errors TSV_TNEW_PAGE_ALLOC_FAILED
    Have you done any custom coding? If yes, check the LOOP ... ENDLOOP statements for a loop that never terminates.
    Thanks & Regards,
    Ashish

  • Data extraction for BW/BI

    Hi,
    I am new to BW. Can anyone send me material on data extraction in BW? I mainly want material on LO extraction. If anyone could provide material on extractors such as the LO Cockpit, generic DataSources, etc., I would be really thankful. Please send the material to "baljinder4u_gmail.com".
    Thanks in advance

    Hi Rakesh
    Step-by-step control flow for a successful data extraction with SAP BW:
       1.  An InfoPackage is scheduled for execution at a specific point of time or for a certain system- or user-defined event.
       2.  Once the defined point of time is reached, the SAP BW system starts a batch job that sends a request IDoc to the SAP source system.
       3.  The request IDoc arrives in the source system and is processed by the IDoc dispatcher, which calls the BI Service API to process the request.
       4.  The BI Service API checks the request for technical consistency. Possible error conditions include specification of DataSources unavailable in the source system and changes in the DataSource setup or the extraction process that have not yet been replicated to the SAP BW system.
       5.  The BI Service API calls the extractor in initialization mode to allow for extractor-specific initializations before actually starting the extraction process. The generic extractor, for example, opens an SQL cursor based on the specified DataSource and selection criteria.
       6.  The BI Service API calls the extractor in extraction mode. One data package per call is returned to the BI Service API, and customer exits are called for possible enhancements. The extractor takes care of splitting the complete result set into data packages according to the IDoc control parameters. The BI Service API continues to call the extractor until no more data can be fetched.
       7.  The BI Service API finally sends a final status IDoc notifying the target system that request processing has finished (successfully or with errors specified in the status IDoc).
    Note
    Control parameters specify the frequency of intermediate status IDocs, the maximum size (either in kilobytes or number of lines) of each individual data package, the maximum number of parallel processes for data transfer, and the name of the application server to run the extraction process on.
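    Purely as an illustration of the packaging in step 6 (this is not SAP code, and the 50,000-line cap is an assumed control-parameter value), here is a small Python sketch of splitting a result set into data packages by a maximum number of lines:

        def split_into_packages(records, max_lines=50000):
            """Yield successive data packages of at most max_lines records."""
            package = []
            for rec in records:
                package.append(rec)
                if len(package) == max_lines:
                    yield package
                    package = []
            if package:  # final, possibly smaller, package
                yield package

        # The caller keeps asking for packages until none are left, mirroring how
        # the BI Service API keeps calling the extractor until no more data comes.
        for pkg in split_into_packages(range(120000)):
            print(len(pkg))  # 50000, 50000, 20000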
    Here is the LO Cockpit procedure, step by step:
    LO EXTRACTION
    - Go to Transaction LBWE (LO Customizing Cockpit)
    1). Select Logistics Application -> SD Sales BW -> Extract Structures
    2). Select the desired Extract Structure and deactivate it first.
    3). Give the Transport Request number and continue
    4). Click on 'Maintenance' to maintain the Extract Structure
           Select the fields of your choice and continue
                 Maintain DataSource if needed
    5). Activate the extract structure
    6). Give the Transport Request number and continue
    - Next step is to Delete the setup tables
    7). Go to T-Code SBIW
    8). Select Business Information Warehouse
    i. Setting for Application-Specific Datasources
    ii. Logistics
    iii. Managing Extract Structures
    iv. Initialization
    v. Delete the content of Setup tables (T-Code LBWG)
    vi. Select the application (01 – Sales & Distribution) and Execute
    - Now, Fill the Setup tables
    9). Select Business Information Warehouse
    i. Setting for Application-Specific Datasources
    ii. Logistics
    iii. Managing Extract Structures
    iv. Initialization
    v. Filling the Setup tables
    vi. Application-Specific Setup of statistical data
    vii. SD Sales Orders – Perform Setup (T-Code OLI7BW)
            Specify a run name, time, and date (use a future date)
                 Execute
    - Check the data in Setup tables at RSA3
    - Replicate the DataSource
    Use of setup tables:
    You fill the setup tables in the R/3 system (the setup is in SBIW) and extract the data to BW; after that you can do delta extractions by initializing the extractor.
    Full loads are always taken from the setup tables.
    please follow the link
    https://www.sdn.sap.com/irj/sdn/advancedsearch?query=dataEXTRACTIONIN+BW&cat=sdn_all
    Regards
    Tapashi
    Edited by: Tapashi Saha on Aug 18, 2008 11:03 AM

  • HFM data extract via Web

    We are using 11.1.2.1 on Windows 2008 R2 servers.
    Is there anything in HFM that limits the file size when you do a metadata or data export? Maybe not a file size limitation, but perhaps a time limitation or something similar? When I did an export of the application via the HFM client, I got the entire file. When I did it via the web, I could tell I got only about half of the file (based on the file size), and I think it gave me the last half of the file, since it didn't start with the correct file tags. I typically don't do any of my extracts via the web (I do them through the client), but many of our users do. If there is some kind of limitation within Hyperion, do you think we could increase it? We have a user who didn't get a complete file via the web.

    Not sure if this is your problem or not, but when we upgraded to 11.1.2.1.103 I had a similar issue where I would only get half of my metadata extract. Maybe try this and see if it helps:
    Metadata/Data Exports
    You will need to set the AspResponseBufferLimit by performing the following steps.
    By default, the value for the ASPBufferLimit property in IIS 6 and for the bufferLimit property in IIS 7 is 4,194,304 bytes (4 MB).
    The recommendation is to set the value to 1073741824 (1 GB).
    To increase the buffering limit in IIS 6, follow these steps:
    1. Click Start, click Run, type cmd, and then click OK.
    2. Type the following command, and then press ENTER:
    cd /d %systemdrive%\inetpub\adminscripts
    3. Type the following command, and then press ENTER:
    cscript.exe adsutil.vbs SET w3svc/aspbufferinglimit LimitSize
    Note: LimitSize represents the buffering limit size in bytes. For example, the number 1073741824 sets the buffering limit size to 1 GB.
    To increase the buffering limit in IIS 7, follow these steps:
    1. Click Start, click Run, type inetmgr, and then click OK.
    2. In the left hand panel, expand the computer name by clicking the plus sign next to it.
    3. Expand the “Sites”.
    4. Expand “Default Web Site”
    5. Select the HFM web site in the expanded list.
    6. In the middle panel “Features View”, in the IIS section, double click the ASP icon.
    7. Expand “Limits Properties”.
    8. Change “Response Buffering Limit” to 1073741824.
    9. In the right hand panel, select “Apply”.
    10. Restart IIS.

  • Purchase Order data extraction for Service PO

    Hi Experts,
    I am facing a problem with the extraction of service PO data. The extractor 2LIS_02_ITM gives the item details for the PO, but I am not getting the service number from it.
    The item data shows only that it is a service-type PO, but does not give the service number.
    How can I do this? Please reply.
    Thanks,
    Sharad

    Hi Sharad,
    You can enhance the extract structure for 2LIS_02_ITM by adding a new field.
    To populate this field, please try the logic below in the user exit code. I am not sure which field you need from ESLL; I am assuming the activity number, which is SRVPOS.
    1. Select ESSR-PACKNO into a temporary variable where ESSR-EBELN = the purchasing document number in the extract and ESSR-EBELP = the purchasing document item in the extract.
    2. Select ESLL-SRVPOS where ESLL-PACKNO = the PACKNO retrieved in the step above.
    I was not able to find any direct link between the purchasing document number and ESLL, which is why I used table ESSR to get the PACKNO.
    Hope this helps.
    Thanks,
    Archana

  • Problem in Data extraction for NEW GL DSO 0FIGL_O10

    Hi,
    I am facing a problem with the extraction of records from SAP to BW.
    I have installed the Business Content for the New GL DSO 0FIGL_O10.
    When I extract the data from SAP R/3 into this DSO (0FIGL_O10), records are being overwritten.
    For example, when I go to the Manage option (InfoProvider administration), the transferred records and the added records are not the same; the added records are fewer than the transferred records.
    This is happening because of the key field definitions.
    I have 16 characteristics in the key fields, which is the maximum I can have, but in some cases the incoming data is only unique beyond those keys.
    As a result the data gets aggregated in the DSO, and hence my balances do not match the SAP R/3 GL account balances.
    There are 31 characteristics in total in the DataSource (0FI_GL_10), of which only 16 can be included in the key field area.
    Please suggest a solution.
    Regards,
    Nilesh Labde

    Hi,
    For safety, the delta process uses a lower interval setting of one hour (this is the default setting). In this way, the system always transfers all postings made between one hour before the last delta upload and the current time. The overlap of the first hour of a delta upload causes any records that are extracted twice to be overwritten by the after-image process in the ODS object with the MOVE update. This ensures 100% data consistency in BW.
    But you can achieve your objective in a different manner:
    Create a custom InfoObject, ZDISTINCT, and populate it in the transformation using ABAP code. In the ABAP, try to compound the values from the different characteristics so that one combined characteristic is formed. Use ZDISTINCT in your DSO as a key; a rough sketch of the idea follows below.
    Just a thought; maybe it can solve your problem.
    Ravish.
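    A minimal, hypothetical sketch of the compounding idea (the real implementation would be ABAP in the transformation routine; ZDISTINCT and the field names below are assumptions), written in Python for brevity:

        # Combine several characteristics into one compound key value so that a
        # single key field (e.g. ZDISTINCT) can stand in for keys beyond the 16-field limit.
        def build_zdistinct(record, fields):
            """Concatenate the chosen characteristic values with a separator."""
            return "|".join(str(record.get(f, "")) for f in fields)

        # Hypothetical characteristics that would otherwise exceed the key limit.
        extra_keys = ["DOC_NUMBER", "DOC_ITEM", "REF_KEY", "FISC_PER"]
        record = {"DOC_NUMBER": "4711", "DOC_ITEM": "10",
                  "REF_KEY": "AB12", "FISC_PER": "007.2010"}
        record["ZDISTINCT"] = build_zdistinct(record, extra_keys)
        print(record["ZDISTINCT"])  # 4711|10|AB12|007.2010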

  • Solution Manager Data Extraction for BW

    Hi,
    We have a Solution Manager in place for SAP Systems. Our requirement is to extract data from Solution Manager into BW and provide  BW Reports for Solution Manager Reporting & analysis.
    What are the standard DataSources/Extractors available in solution manager for this?
    What are the Business Contents (Infosources) available in BW?
    Any help will be rewarded.
    Regards
    Prasad

    Hi Prasad,
    have you already found the related IMG tree? It guides you through the setup.
    Transaction SPRO -> SAP Reference IMG -> SAP Solution Manager -> Scenario-Specific Settings -> Operations -> BI Reporting
    transaction: RSA5 (Installation of DataSource)
    -  0SM_SMG_ROOT                   Solution Manager top node
    -- 0SM_SMG                        Solution Manager
    --- 0SM_DSWPBI_DB390
    --- 0SM_DSWPBI_DB400                  DB400 Data
    --- 0SM_DSWPBI_DBADA                  Adabas Data
    --- 0SM_DSWPBI_DBDB2                  DB2 Data
    --- 0SM_DSWPBI_DBINF                  Informix Data
    --- 0SM_DSWPBI_DBMSS                  SQL Server Data
    --- 0SM_DSWPBI_DBORA                  Oracle Data
    --- 0SM_DSWPBI_MODUL                  Module Data
    --- 0SM_DSWPBI_PERF                   Performance Data
    --- 0SM_DSWPBI_SERVER                 Server Data
    --- 0SM_DSWPBI_SYSDATA                System Details Data
    --- 0SM_DSWPBI_SYSTEM                 System Data
    Best regards,
    Ruediger Stoecker

  • Real-time data extraction for FI-GL

    Hi All,
    I have a business requirement to have FI-GL transactions updated in 5 minutes or less in our BI system.
    As per OSS Note 991429 (Minute based extraction enhancement for 0FI * 4 extractors), this seems possible in theory. However, the settings in this note indicate that the extraction can only happen every 3600 seconds (1 hour) or more to avoid losing any delta records.
    Has anybody been able to use either DataSource 0FI_GL_4 or any other standard DataSource to set up a real-time extraction (the key point being a real-time update of 5 minutes or less, not an hour)?
    Thanks a lot for your help.
    David

    I would like to inform you that SAP does not recommend reducing the safety interval from 1 hour to 15 minutes.
    This setting can cause missing records in the BW system.
    It is at your OWN risk, and it is NOT supported by SAP if it causes missing data in BW.
    You can change the safety interval in table BWOM_SETTINGS.
    Please take a detailed look at note 991429.
    Colin
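    To illustrate the selection window the safety interval creates (a conceptual Python sketch, not SAP code; the 3600-second default comes from the posts above):

        from datetime import datetime, timedelta

        def delta_selection_window(last_delta_upload, now, safety_seconds=3600):
            """Return the (from, to) timestamps a delta request would cover."""
            return last_delta_upload - timedelta(seconds=safety_seconds), now

        last_upload = datetime(2010, 5, 24, 10, 0)
        frm, to = delta_selection_window(last_upload, datetime(2010, 5, 24, 12, 0))
        # Postings from 09:00 onwards are re-read; the one-hour overlap is what
        # protects against losing late-posted delta records.
        print(frm, to)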

  • Custom Dimension sql names in version 11.1.2 HFM Data Extract

    Does anyone know how to reference the custom dimensions in Hyperion 11.1.2? In Hyperion 9.3 they were custom1_item, custom2_item, custom3_item, and custom4_item, but I am not sure what they are now called. Any help would be appreciated.

    I'm not clear on what you're after, but I think your question is about the item tables in HFM's database? If so, the custom dimension table is now appname_custom_item for all of the custom dimensions. Each custom is then distinguished by the LDIMID field, which contains the custom dimension's number ("1", "2", "3", "4"). I have not examined the same for 11.1.2.2 for an application having more than four custom dimensions.
    --Chris

  • Data extraction for debit/credit indicator A & L

    Hello,
    In ECC there are certain records against which the debit/credit indicators are A and L. How can we pull them into BW?
    Also, we are getting the amount for those records in BW, but not the quantity.
    Thanks & Regards,
    Shilpi Gupta

    Hi Shilpi,
    Check your transformation / transfer rules for quantity.
    If quantity is not mapped, map the quantity field from the DataSource to your BW target field.
    Hope this will help you.
    Thanks,
    Vijay.

  • Data Extraction for a Project Created in cProject

    Hi,
    I have created a project with phases and tasks in cProjects. I am trying to write a listing report, but I am unable to find the tables which contain all the details of the project. I am able to get only the project header detail, not the phase and task details. Please help.
    Regards,
    Srikanth

    Please check the PROJ table; there is a data element PSPID for the project ID.

  • Error in Data Extraction for Asset Accounting Data Source

    Hi all,
    We have installed the new flow of Asset Accounting, i.e. the 0FI_AA_20 DataSource, and we are trying to load the data into BW. The problem is that the volume of data is very high, around 3.5 million (35 lakh) records.
    I am getting the errors below while loading into BW:
    "Error 7 when sending an IDoc"
    "No more storage space available for extending an i 3 "
    Please suggest.
    Regards,
    Macwan James.

    Hi Macwan James,
    You can delete requests not only from the DataSource you are loading into, but also from the manage screens of other DataSources, if they are no longer required.
    Alternatively, use the selection screen of the InfoPackage to load the data: change the selections and try the load again.
    Hope it helps!
    Regards,
    Pavan

  • Issue with Data Extraction for Master data

    Hi All,
    I have a requirement to enhance the standard DataSource 0VENDOR_ATTR to populate certain fields from table LFBK.
    I have enhanced the DataSource, and I am able to pull the data in ECC via RSA3.
    After that I replicated the DataSource in BI and activated it; the enhanced fields are available in the BI DataSource, and I have created the transformation between the DataSource and 0VENDOR.
    When I run the InfoPackage, the new enhanced fields are not getting populated. I really wonder why the data is not coming into BI when it is populated in RSA3 in ECC.
    Please share your suggestions for solving this issue.
    Regards,
    Gowrisankar

    Hello Gowrisankar N K,
    After making changes to your 0VENDOR_ATTR DataSource, make sure that you have replicated the DataSource on the BI side.
    If not, please replicate the DataSource first and check that the newly enhanced field is available in the PSA.
    Regards,
    Divyesh Khambhati

  • Steps for Data extraction from SAP r/3

    Dear all,
    I am new to SAP BW.
    I have done data extraction from Excel into the SAP BW system, along these lines:
    Create InfoObjects -> InfoArea -> Catalogs (characteristic catalog and key figure catalog)
    Create the InfoSource
    Upload the data
    Create the InfoCube
    I need similar steps for data extraction from SAP R/3:
    1. when the data is in Z tables (using views/InfoSets/function modules, etc.)
    2. when the data is in standard SAP tables, using Business Content.
    Thanks and Regards,
    Gaurav Sood

    Hi,
    Check the links below:
    Generic Extraction
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
    CO-PA
    http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
    CO-PC
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
    iNVENTORY
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Extractions in BI
    https://www.sdn.sap.com/irj/sdn/wiki
    LO Extraction:
    /people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
    /people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
    /people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
    /people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
    /people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
    Remya
