FDQM Sample data load file for HFM

Hi all,
I have just started working on FDQM 11.1.1.3. I have integrated an HFM application with FDQM. Now I need to load data into the HFM application, but I don't have a GL file or CSV file with HFM dimensions. Can anyone help me get such a file?
And one more thing: I just want to know the basic steps I need to perform to load data into the HFM application after creating the FDM application and integrating it with HFM.
Thanks.

Hi,
Creating the FDM application and integrating it with the HFM application also includes setting the Target Application name in FDM. Once that is done, the FDM application is ready with its target set to the HFM application.
1. Create an Import Format for the application. The example below uses only the Custom1 and Custom2 dimensions of the four available Custom dimensions (adjust it for your own dimensions), with '|' (pipe) as the delimiter:
Account|Account Description|Entity|C1|C2|Amount
2. Create a text file with data rows like those below, combining members from each dimension specified in the import format:
11001|Capex|111|000|000|500000
11002|b-Capex|111|000|000|600000
Note: these dimension members must be mapped in the 'Mappings' section; use the members from the file as the source and select a target HFM member for each.
3. Tag this import format to a location.
4. Import the text file using the 'Browse' option.
5. Validate the loaded data.
Note: if mappings exist for all the dimension members used in the file, validation will succeed.
6. Export the data.
Exporting loads the data into the HFM application using the Replace or Merge option.
Those are the basic steps to load data from FDM into HFM.
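The sample file in step 2 is just pipe-delimited text, so if you have no GL extract to start from you can generate one yourself. A minimal sketch in Python; the file name and the data rows are only illustrative, and the members must still exist in your own mappings:

```python
# Generate a pipe-delimited data file matching the import format:
# Account|Account Description|Entity|C1|C2|Amount
rows = [
    ("11001", "Capex", "111", "000", "000", "500000"),
    ("11002", "b-Capex", "111", "000", "000", "600000"),
]

def make_load_file(path, rows):
    """Write one pipe-delimited line per data row."""
    with open(path, "w") as f:
        for row in rows:
            f.write("|".join(row) + "\n")

make_load_file("hfm_sample_load.txt", rows)
```

The resulting text file can be picked up directly by the Import step's 'Browse' option.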
Regards,
J

Similar Messages

  • Data Load file and Rule file Creation

    Hi,
    I used to create rules files and data files for loading data into Essbase 7.1.6.
    I haven't worked in Essbase for the past two years and have forgotten the options, field properties, and data load file creation.
    Could you please advise me, or point me to a demo, on creating rules files as well as data files?

    Two things I could suggest:
    1. Look at the Sample.Basic application; it has dimension and data load rules for all sorts of scenarios.
    2. Come to my session at Kaleidoscope, "Rules Files Beginning to Advanced", where I go over some of the more interesting things you can do with rules files.

  • I'm doing a scan around a line by sampling data over 360 degrees for every value of z (z is the position on the line). So that means I have a double for-loop where I collect the data. The problem comes when I try to plot the data. How should I do it?

    Jonas,
    I think what you want is a 3D plot of a cylinder. I have attached an example using a parametric 3D plot.
    You will probably want to duplicate the points for the first theta value to close the cylinder. I'm not sure what properties of the graph can be manipulated to make it easier to see.
    Bruce
    Bruce Ammons
    Ammons Engineering
    Attachments:
    Cylinder_Plot_3D.vi (76 KB)
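    Outside LabVIEW, the same idea (treating the scan as a parametric surface over theta and z, and duplicating the first theta column to close the cylinder, as Bruce suggests) can be sketched like this. The data here is fabricated; in practice `r` would be your (n_z, n_theta) array of samples:

```python
import numpy as np
import matplotlib.pyplot as plt

# Fake scan data: radius sampled every degree (0..359) at each z position.
n_theta, n_z = 360, 50
theta = np.deg2rad(np.arange(n_theta))
z = np.linspace(0.0, 1.0, n_z)
r = 1.0 + 0.05 * np.random.rand(n_z, n_theta)  # one row per z, one column per angle

# Duplicate the first theta column so the surface closes on itself.
theta_c = np.append(theta, theta[0] + 2 * np.pi)
r_c = np.hstack([r, r[:, :1]])

# Convert cylindrical (r, theta, z) to Cartesian for a parametric surface plot.
T, Z = np.meshgrid(theta_c, z)
X = r_c * np.cos(T)
Y = r_c * np.sin(T)

ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(X, Y, Z)
plt.show()
```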

  • Data loading sequence for 0FIGL_014

    Hi experts,
    Can you explain or briefly describe the data loading sequence for 0FIGL_014?
    Regards.
    Prasad

    Hi,
    Following is my system configuration information
    Software Component   Release   Level   Highest Support Package   Short Description
    SAP_BASIS            620       0058    SAPKB62058                SAP Basis Component
    SAP_ABA              620       0058    SAPKA62058                Cross-Application Component
    SAP_APPL             470       0025    SAPKH47025                Logistics and Accounting
    SAP_HR               470       0030    SAPKE47030                Human Resources
    With the above configuration in R/3, I am not able to find the DataSource 0FI_GL_14.
    Can you please let me know which package to install, and how to install it?
    Regards.
    Prasad

  • Need data loader script for Buyer Creation for R12

    Hi ,
    Does anybody have a Data Loader script for buyer creation in R12? Please paste a sample line in your reply to this message.
    Thanks


  • Batch load file for folder

    hi,
    I am using UCM 10.1.3.3.3. Can I use a batch load file for folders?
    There seems to be only primaryFile in the batch load definition file, which means I can only define files to load one by one. Is there a way to define folders to load?
    I have millions of files in about 100 folders. If there is no folder-loading function, what would you suggest for this case?
    Thanks!
    Best regards

    Hi,
    If you want to replicate a local folder structure with documents into UCM, I don't think Batch Loader can do that.
    You could either use drag-and-drop, or write a custom component that creates an IdcCommand file for folder creation as well as for checking documents into the UCM folders.
    Regards,
    Prateek

  • Check data load performance for DSO

    Hi,
    Can anyone provide details on how to check the data load performance for a particular DSO?
    For example, how much time it took to load a particular number of records (e.g. 200,000) into the DSO from the R/3 system. The DSO data flow is on BW 3.x.
    Thanks,
    Manjunatha.

    Hi Manju,
    You can use BW Statistics and its standard content.
    Regards,
    Rambabu

  • Data loading mechanism for flat-file loads for hierarchies

    Hi all,
    We have a custom hierarchy that gets its data from a flat file stored on the central server, which in turn gets data from MDM through XI. Now, if we delete a few records in MDM, the data picked up in BI will no longer contain the deleted records. Does this mean the hierarchy deletes the data it already contains and does a full load, or does it mean that every time we load data into BI we delete the records from the BI tables and reload?
    We also have some Web Service text DataSources (loaded from XI).
    Is the logic for updating hierarchy records different from that of the existing Web Service interfaces?
    Can anyone explain the mechanism behind these data loads and how it differs between the loads mentioned above?

    Create the ODS with the correct keys and run full loads from the flat files. You can have a cube pulling data from the ODS:
    1. Load data into the ODS.
    2. Create the cube.
    3. Generate the export DataSource (RSA1 > right-click the ODS > Generate Export DataSource).
    4. Replicate the export DataSource (RSA1 > Source Systems > DataSource Overview > search for the DataSource starting with '8' plus the ODS name).
    5. Press the '+' button and activate the transfer rules and communication structure.
    6. Create the update rules for the cube with the above InfoSource (same as the '8<ODSNAME>' DataSource).
    7. Create an InfoPackage with initial load (in the Update tab) and load the data to the cube.
    8. Load new full loads into the ODS.
    9. Create a new InfoPackage for delta (in the Update tab) and run it; any changed or new records will be loaded to the cube.
    Regards,
    BWer
    Assign points if helpful.

  • How to automate the data load process using data load file & task Scheduler

    Hi,
    I am automating the process of loading data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
    I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
    Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler, and tell me what other files are required to achieve this?
    Thanks

    To follow up on your question: are you using MaxL scripts for the data load?
    If so, I have seen an issue where, if the batch file (e.g. load_data.bat) does not use the full path to the MaxL script, running it through Task Scheduler appears to work but the log and/or error file is not created. In other words, the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
    If you are using MaxL, use this in the batch:
    essmsh C:\data\DataLoad.mxl
    You can also use the full path for essmsh; either way works. The only reason the MaxL might then still not work is if the batch doesn't reflect all the MaxL PATH changes, or if you need to update your environment variables so that the essmsh command works in a command prompt.
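    The "full path everywhere" advice can be made mechanical. A minimal sketch (function names and paths are illustrative, not part of any Hyperion API) that refuses relative paths when building the essmsh call, then checks that the expected log file actually appeared afterwards, which is exactly the symptom Task Scheduler hides:

```python
import os
import subprocess

def build_dataload_command(essmsh_exe, maxl_script):
    """Build the essmsh invocation, insisting on absolute paths so the
    command behaves the same under Task Scheduler as in a console."""
    for p in (essmsh_exe, maxl_script):
        if not os.path.isabs(p):
            raise ValueError("use a full path, not: %s" % p)
    return [essmsh_exe, maxl_script]

def run_and_verify(cmd, log_file):
    """Run the load, then confirm the MaxL log file was created --
    the failure mode described above is 'ran, but no log'."""
    subprocess.run(cmd, check=True)
    if not os.path.exists(log_file):
        raise RuntimeError("no log at %s; check the paths in the batch" % log_file)

# Intended usage on the Windows box (placeholder paths):
#   cmd = build_dataload_command(r"C:\Hyperion\bin\essmsh.exe", r"C:\data\DataLoad.mxl")
#   run_and_verify(cmd, r"C:\data\DataLoad.log")
```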

  • BP 258 Sample Data Load

    Hello,
    I've configured the prerequisites for the Best Practices for BPC MS 7.5 (V1.31) and am on BP 258, trying to upload the sample data provided (258_Y2010_Sales_Actual.txt). When I run the DM package for IMPORT, it errors out with:
    Task Name : Convert Data
    Dimension  list          CATEGORY,P_CC,RPTCURRENCY,TIME,CUSTOMER,PRODUCT,SALESACCOUNT,AMOUNT
    [2010.004] Calculated member or invalid member          ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,SalesUnitPrice,-15000
    [2010.004] Calculated member or invalid member          ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,SalesQuantity,200
    [2010.004] Calculated member or invalid member          ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,TotalSalesRevenue,-3000000
    All I can figure from this limited info is that it does not like the conversion of the TIME dimension. I checked my dimension members and I have TIME members covering the ones shown in the file (2010.004, for example). The BP doc says nothing about maintaining the conversion file or a custom transformation file; it says to leave that empty to default to IMPORT.xls.
    I just finished the BPC 410 course, but can't correlate anything we did there to this error (or I missed it).
    Can anyone shed some light on this error? 
    thanks

    Thank you for responding. I ran through all the checks and all look OK to me.
    1 - Optimize: all green, no errors.
    2 - Yes, I already checked this, but am not 100% sure I am seeing what the system is seeing. My dimension members include the TIME items that exist in the sample data. Only one value did not exist: my members start at 2010.004 and the sample data included 2010.001. I removed those records, but it still fails.
    Here is the reject list:
    Task Name : Convert Data
    [Dimension:  TIME]
    2010.007
    2011.002
    2010.005
    2010.008
    2011.001
    2010.006
    2011.003
    2010.011
    2010.004
    2010.001
    2010.012
    2010.009
    This tells me (an inexperienced BPC person) that there is a problem in the conversion of external to internal data for that dimension. However, I have already validated that they are equal; I can't see how 2010.004 would not equate to 2010.004, unless it is comparing against the EVDESC and not the ID (2010.004 versus 2010 APR). Am I correct in that assumption?
    3 - Yes, I've also tried changing to a new transformation file with the conversion and delimiters mapped differently, but with the same result. I'm sure I am missing something trivial here, so I appreciate any help figuring it out.
    Thanks again
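    One way to take the EVDESC-vs-ID guessing out of the picture is to diff the two sets directly: pull the TIME column out of the data file and compare it against an export of the dimension's member IDs. A minimal sketch, assuming the file is comma-delimited with TIME in the fourth column as in the dimension list above (the member lists here are made up for illustration):

```python
def rejected_time_values(data_lines, member_ids, time_col=3):
    """Return the TIME values that appear in the data file but not in the
    dimension's member IDs -- the same set the Convert Data task rejects."""
    seen = {line.split(",")[time_col].strip() for line in data_lines}
    return sorted(seen - set(member_ids))

# Tiny example: 2010.001 is in the data but not in the dimension.
data = [
    "ACTUAL,SALESTEAM1,USD,2010.004,CUSTOMERA,PRODUCTA,SalesQuantity,200",
    "ACTUAL,SALESTEAM1,USD,2010.001,CUSTOMERA,PRODUCTA,SalesQuantity,100",
]
members = ["2010.004", "2010.005"]
print(rejected_time_values(data, members))  # expect ['2010.001']
```

    If the function reports no rejects while the DM package still rejects the rows, the mismatch is in the conversion/transformation mapping rather than in the member IDs themselves.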

  • Data load error for Init.

    Hello Gurus,
    I am having some problems loading the init. During the data load I get the following error messages:
    1. System error occurred (RFC call)
    2. Activation of data records from ODS object ZDSALE01 terminated
    3. No confirmation for request ODSR_45F46HE2FET0M7VFQUSO2EHPZ when activating the ODS object ZDSALE01
    4. Request REQU_45FGXRHFHIKAEMXL3D7HU6OFR, data package 000001 incorrect with status 5
    5. Request REQU_45FGXRHFHIKAEMXL3D7HU6OFR, data package 000001 not correct
    6. Inserted records 1- ; Changed records 1- ; Deleted records 1-
    Please help me resolve these errors.

    Hi,
    Are you loading a flat file? If yes, check whether the file is available at the specified path and whether you/SAP have the authority to access that path/file.
    regards
    Siggi

  • Maxl Error during data load - file size limit?

    Does anyone know if there is a file size limit when importing data into an ASO cube via MaxL? I have tried to execute:

        Import Database TST_ASO.J_ASO_DB data
        using server test data file '/XX/xXX/XXX.txt'
        using server rules_file '/XXX/XXX/XXX.rul'
        to load_buffer with buffer_id 1
        on error write to '/XXX.log';

    It errors out after about 10 minutes and gives "unexpected Essbase error 1130610". The file is about 1.5 gigs of data. The file location is right. I have tried the same code with a smaller file and it works. Do I need to increase my cache or anything? I also got "DATAERRORLIMIT reached" and I cannot find the log file for this...? Thanks!

    Have you looked in the data error log to see what kind of errors you are getting? The odds are high that you are trying to load data into calculated members (or upper-level members), resulting in errors; it is most likely the former.
    You specify the error file with the
        on error write to '/XXX.log';
    statement. Have you looked for this file to find out why you are getting errors? Do yourself a favor: load the smaller file and look at the error file to see what kind of error you are getting. It is possible that your error file is larger than your load file, since multiple errors on a single load item may result in a restatement of the entire load line for each error.
    This is a starting point for your exploration of the problem.
    DATAERRORLIMIT is set in the config file; the default is 1000 and the maximum is 65000.
    NOMSGLOGGINGONDATAERRORLIMIT, if set to TRUE, just stops logging and continues the load when the data error limit is reached. I'd advise using this only in a test environment, since it doesn't solve the underlying problem of data errors.
    Probably what you'll have to do is ignore some of the columns in the data load that load into calculated fields. If you have some upper-level members, you could put them in a skip-loading condition.
    Let us know what works for you.

  • Data Load process for 0FI_AR_4  failed

    Hi!
    I am about to implement the SAP Best Practices scenario "Accounts Receivable Analysis".
    When I schedule the data load process (in dialog, immediately) for the transaction data of 0FI_AR_4 and check it in the Monitor, the status is yellow. At the top I can see the following information:
    12:33:35 (194 from 0 records)
    Request still running
    Diagnosis: No errors found. The current process has probably not finished yet.
    System Response: The ALE inbox of BI is identical to the ALE outbox of the source system, or the maximum wait time for this request has not yet been exceeded, or the background job has not yet finished in the source system.
    Current status: No IDocs arrived from the source system.
    Question: what actions can I take to run the loading process successfully?

    Hi,
    The job still seems to be in progress.
    You can monitor the job that was created in R/3 (copy the technical name from the monitor, prefix it with "BI", and search for that name in SM37 in R/3).
    Keep an eye on ST22 as well if this job is taking too long, as you may already have gotten a short dump for it that has not yet been reported to the monitor.
    Regards,
    De Villiers

  • How to identify the maximum data loads done for a particular day?

    Hi all,
    A huge volume of data was loaded last Monday, and the data was then deleted from the cubes on the same day. So I need to see which cubes were loaded with a lot of records on that particular day.
    I happened to look at RSMO, but I am unable to see the ODS or cube data loads there.
    Where do I see them?
    Thanks

    See if the table RSSELDONE helps; it will give you the recent data load details. Based on those loads, you can search for the targets.
    Also check table TBTCO, which gives the latest job details. You will have to analyze those jobs to know which loads were done. Give a selection for the date.

  • Data load process for FI module

    Dear all,
    We are using BI 7.0, and one of our FI DataSources, 0EC_PCA_1, had a data load failure. The cause of the failure was analysed, and we did the following:
    1. Deleted the data from the cube and the PSA.
    2. Reloaded the data (full load), without disturbing the init.
    This solved our problem. Now that data reconciliation has been done, we find doubled entries for some of the G/L codes.
    I have a doubt here: since there is no setup table for FI transactions (correct me if I am wrong), the full load picked up data that was also present in the delta queue, and subsequently the delta load loaded the same data again (some G/Ls that were available as delta).
    Kindly explain the functioning of FI data loads. Should we go for downtime, and how do FI data loads work without setup tables?
    Can the experts provide a solution for addressing this problem, and a step-by-step process that can be adopted to solve it permanently?
    Regards,
    M.M

    Hi Magesh,
    The FI DataSources do not involve setup tables when performing full loads, and they do not involve an outbound queue during delta loads.
    A full load happens directly from your DataSource view to BI, and deltas are captured in the delta queue.
    Yes, you are right in saying that when you did the full load, some of the values pulled were also present in the delta queue; hence you have double loads.
    You need to completely re-initialise, as the full load process has been disturbed. Whether to take downtime depends on how frequently transactions occur.
    You need to:
    1. Completely delete the data in BW, including the initialisation.
    2. Take downtime if necessary.
    3. Re-initialise the whole DataSource from scratch.
    Regards,
    Pramod
