EIS data load via rules file

Hi,
Please let me know the process to control the EIS data load into Essbase using a rules file. I did not find the option in EIS. Please help me.
Thanks,
pr

In EIS:
1) You have to define the logical OLAP model, connecting to the relational source. It defines the joins between the fact table and the dimension tables.
2) Based on the OLAP model, you have to create a metaoutline, which defines the rules for loading members and data into Essbase.

Similar Messages

  • EIS Data load

    Hi all,
I am a bit confused about loading data via EIS. We have two processes: one is to load data via the rules file hisdld created by EIS, and the second is using a user-defined SQL query. Are these two methods independent of each other or not?
When I say independent, I mean: if we make some changes in the user-defined SQL and load data, are those changes first reflected in the rules file before the data load runs with that rules file, or does the data load occur just via the SQL query we run?
    It would be great if somebody could clarify the same.
    Thanks in advance!!

    Thanks for the reply.
    Yes i am talking about sql query within EIS so if i understood it right by your post if i make changes to that user defined sql code(within EIS ) it will make the changes in the data load rule file created by EIS i.e "hisdld". So If we talk of a data load process within EIS the User defined sql is used to create data load rule "hisdld" and hence they are dependant on each other.
    Edited by: user10859098 on Jan 29, 2009 10:54 AM

  • EIS data load slow

    Greetings all,
    Windows Server 2003 SP2. SQL Server 2000 SP3. Hyperion 9.3.1.
    I am running an EIS data load, and the load time is progressively longer depending on the number of records loaded.
    For example, when doing a data load of 1,000,000 records, it takes 2 minutes to load the first 500,000 records but an additional 10 minutes to load the next 500,000 records.
    Is there any reason for this, and can I do anything about it?
    Regards,
    Paul.

Since non-streaming mode is the only mode EIS uses to load data, it's not surprising that data loads are slow.
If you want to continue using EIS for data loads, I believe all you can do is refine/tune your user-defined SQL.
You've mentioned just a few million records. However, in volume-intensive environments where records number in the hundreds or thousands of millions, most people would use EIS only for the metadata load and defer the data load to flat-file data extracts, which is far faster. At the same time, you have to consider whether the extraction time required is so long that it defeats the very purpose.
    - Natesh
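The flat-file-extract approach Natesh describes can be sketched roughly as follows. This is a minimal illustration only: sqlite3 stands in for the real relational source, and the table, column, and file names are made up.

```python
import csv
import os
import sqlite3
import tempfile

def extract_to_flat_file(conn, query, out_path, delimiter="\t"):
    """Stream rows from a relational source into a delimited flat file,
    which could then be loaded natively (e.g. via a MaxL import)."""
    cur = conn.cursor()
    cur.execute(query)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f, delimiter=delimiter)
        for row in cur:
            writer.writerow(row)

# Demo: sqlite3 stands in for the real relational source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact (entity TEXT, period TEXT, amount REAL)")
conn.executemany("INSERT INTO fact VALUES (?, ?, ?)",
                 [("HO_ATR", "Jan", 1000.0), ("HO_ATR", "Feb", 2000.0)])
out_path = os.path.join(tempfile.mkdtemp(), "extract.txt")
extract_to_flat_file(conn, "SELECT entity, period, amount FROM fact", out_path)
print(open(out_path).read())
```

Whether this pays off depends, as noted above, on how long the extraction itself takes compared with a direct EIS load.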

  • Data loading from flat file to cube using bw3.5

    Hi Experts,
Kindly give me the detailed steps, with screenshots, for data loading from a flat file to a cube using BW 3.5. Please.

    Hi ,
    Procedure
    You are in the Data Warehousing Workbench in the DataSource tree.
           1.      Select the application components in which you want to create the DataSource and choose Create DataSource.
           2.      On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
    The DataSource maintenance screen appears.
           3.      Go to the General tab page.
                                a.      Enter descriptions for the DataSource (short, medium, long).
                                b.      As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
                                c.      Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed it is not generated in a typed structure but is generated with character-like fields of type CHAR only.
    Use this option if conversion during loading causes problems, for example, because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
    In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
           4.      Go to the Extraction tab page.
                                a.      Define the delta process for the DataSource.
                                b.      Specify whether you want the DataSource to support direct access to data.
                                c.      Real-time data acquisition is not supported for data transfer from files.
                                d.      Select the adapter for the data transfer. You can load text files or binary files from your local work station or from the application server.
    Text-type files only contain characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which specifies this character as a component of the value if required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length. The defined field length in the file must be the same as the assigned field in BI.
    Binary files contain data in the form of Bytes. A file of this type can contain any type of Byte value, including Bytes that cannot be displayed or read as text. In this case, the field values in the file have to be the same as the internal format of the assigned field in BI.
    Choose Properties if you want to display the general adapter properties.
                                e.      Select the path to the file that you want to load or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
    You can also create a routine that determines the name of your file. If you do not create a routine to determine the name of the file, the system reads the file name directly from the File Name field.
                                  f.      Depending on the adapter and the file to be loaded, make further settings.
    ■       For binary files:
    Specify the character record settings for the data that you want to transfer.
    ■       Text-type files:
    Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
    Specify the character record settings for the data that you want to transfer.
    For ASCII files:
    If you are loading data from an ASCII file, the data is requested with a fixed data record length.
    For CSV files:
    If you are loading data from an Excel CSV file, specify the data separator and the escape character.
    Specify the separator that your file uses to divide the fields in the Data Separator field.
If the data separator character is part of the value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
You chose the ; character as the data separator. However, your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters.
If the escape characters do not enclose the value but are used within the value, the system interprets the escape characters as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45 and 12"45" is transferred as 12"45".
    In a text editor (for example, Notepad) check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
    Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
    If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry.
                                g.      Make the settings for the number format (thousand separator and character used to represent a decimal point), as required.
                                h.      Make the settings for currency conversion, as required.
                                  i.      Make any further settings that are dependent on your selection, as required.
           5.      Go to the Proposal tab page.
    This tab page is only relevant for CSV files. For files in different formats, define the field list on the Fields tab page.
    Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
                                a.      Specify the number of data records that you want to load and choose Upload Sample Data.
    The data is displayed in the upper area of the tab page in the format of your file.
    The system displays the proposal for the field list in the lower area of the tab page.
                                b.      In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
           6.      Go to the Fields tab page.
    Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
                                a.      To define a field, choose Insert Row and specify a field name.
                                b.      Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
                                c.      Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
    Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
                                d.      Change the data type of the field if required.
                                e.      Specify the key fields of the DataSource.
    These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
                                  f.      Specify whether lowercase is supported.
                                g.      Specify whether the source provides the data in the internal or external format.
                                h.      If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
                                  i.      If required, specify a conversion routine that converts data from an external format into an internal format.
                                  j.      Select the fields that you want to be able to set selection criteria for when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
                                k.      Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
                                  l.      Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
           7.      Check, save and activate the DataSource.
           8.      Go to the Preview tab page.
    If you select Read Preview Data, the number of data records you specified in your field selection is displayed in a preview.
    This function allows you to check whether the data formats and data are correct.
    For More Info:  http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm
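The separator/escape behaviour described in step 4.f can be illustrated with a short Python snippet. The sample values come from the text above; Python's csv module is used here purely as a stand-in parser, not as part of the BI load itself.

```python
import csv
import io

# ';' is the data separator and '"' is the escape character, as in step 4.f:
#   "12;45";100  -> the enclosed value 12;45, then the value 100
#   12"45;200    -> escape chars *inside* the value are kept as normal characters
sample = '"12;45";100\n12"45;200\n'
rows = list(csv.reader(io.StringIO(sample), delimiter=";", quotechar='"'))
print(rows)
```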

  • How to change SQL data source in rule file

Hi there,
I use many rules files to load dimensions and data with the Essbase SQL interface (100-200 rules files!).
When I need to change the information for the data sources, do I have to change it interactively with EAS every time?
Is there any way to change the data source information in batch mode or with a command?

iitobs:
There are API calls (in at least C and VB, that I found) that can use a rules file, and even copy or create one, but none that will modify a rule in any way. Even the "create rule" API call simply reserves the name so that someone else can't create one with that same name until you manage to copy it over from somewhere else - it doesn't create a new rules file, nor does it allow you to set anything in the rule. You have to use EAS to manage rules files internally.

  • Help Required regding: Validation on Data Loading from Flat File

    Hi Experts,
I need your help with the following issue.
I need to validate the transactional data being loaded into the GL cube from a flat file:
1) The transactional data is to be loaded into the cube only if a master data record exists for the "0GL_ACCOUNT" InfoObject.
2) If the master data record does not exist, the record needs to be skipped from the load, and after the load the system should throw a message saying how many records were skipped (if there were any).
I would really appreciate your help and suggestions on solving this issue.
    Regds
    Hari

Hi, write a start routine in the transfer rules like this:
  DATA: l_s_datapak_line TYPE transfer_structure,
        l_s_errorlog     TYPE rssm_s_errorlog_int,
        l_s_glaccount    TYPE /bi0/pglaccount,
        new_datapak      TYPE tab_transtru.
  REFRESH new_datapak.
  LOOP AT datapak INTO l_s_datapak_line.
*   Keep the record only if an active 0GL_ACCOUNT master record exists.
    SELECT SINGLE * FROM /bi0/pglaccount INTO l_s_glaccount
      WHERE chrt_accts = l_s_datapak_line-<field name in transfer structure/DataSource for CHRT_ACCTS>
        AND gl_account = l_s_datapak_line-<field name in transfer structure/DataSource for GL_ACCOUNT>
        AND objvers    = 'A'.
    IF sy-subrc EQ 0.
      APPEND l_s_datapak_line TO new_datapak.
    ENDIF.
  ENDLOOP.
  datapak = new_datapak.
* abort <> 0 means: skip the whole data package.
  IF datapak[] IS INITIAL.
    abort = 4.
  ELSE.
    abort = 0.
  ENDIF.
I have already made some modifications, but you can slightly change it to suit your need.
regards
Emil

  • Cube creation & Data loading from Flat file

    Hi All,
    I am new to BI 7. Trying to create a cube and load data from a flat file.
    Successfully created the infosource and Cube (used infosource as a template for cube)
But I got stuck at that point.
    I need help on how to create transfer rules/update rules and then load data into it.
    Thanks,
    Praveen.

Hi,
Right-click on the InfoSource -> Additional Functions -> Create Transfer Rules.
Now, in the window, insert the fields you want to load from the flat file -> activate it.
Now right-click on the cube -> Additional Functions -> Create Update Rules -> activate it.
Click on the small arrow on the left, and when you reach the last node (DS),
right-click on it -> Create InfoPackage -> External Data tab -> give your flat file path and select CSV format -> Schedule tab -> click on Start.
Hope it helps.
Cheers.

  • EIS data load problem

    Hi,
I use user-defined SQL to do a data load using EIS 7.x.
When I run the SQL in TOAD just to see the data, I see 70,000 rows.
But when I use the same SQL as the user-defined SQL and try to load data, I don't know why EIS says that it loaded around 69,000 rows, and I also see no rejections.
I even made some SQL changes to find out which records are not being loaded. I see some rows when I run the SQL in TOAD, but when I use the same SQL in EIS, it does not load those rows that I can see in TOAD.
This is very strange. Can anyone help me with this?

Glenn,
I don't think there is anything unique about the missing rows.
Actually, a part of the code was added to the previously working view (which I use to load data) to bring in some additional data.
I took that part and tried to load data, and I see no data being loaded or rejected.
It just says records loaded "0", rejected "0".
But the same part brings in the required additional data when executed in TOAD.
And about the Excel sheet lock-and-send: I did that a week ago, and as you said, to my surprise it loads everything.
This was the test by which I figured out that EIS is not even able to load a part of the data, and I found the exact part of the data by going through it closely.
So I think this is something to do with the SQL interface.
And I did not understand the last lines in your post:
"I know when I do regular SQL interface some data types don't load and I have to convert them to varchar to get them to work. If the Excel file and flat file loads work, look at that."
What did you convert to varchar?
The Excel load works fine, so I think it's something similar to what you mentioned in your post.
Can you please explain it?

  • CUNIT error in data loading from flat file after r/3 extraction

    Hi all
After the R/3 Business Content extraction, when I load data from a flat file to the InfoCube, I get a conversion exit CUNIT error. What might be the reason? The data in the flat file's 0UNIT column is accurate, and the mapping rules are also correct, but I still get the CUNIT error.

Check your unit: if you are loading amounts or quantities, check what mapping you have and what you are loading from the flat files.
    BK

  • Data Loads via an Integration Interface

    In our Demantra 7.3 environment, Open Orders, Pricing and Trade Spend all get loaded via Integration Interfaces (IIs). Those IIs get called after ep_load_main runs, and there is nothing in the workflow after those II calls.
    So how does the data get transferred from the BIIO tables into sales_data and/or mdp_matrix? Since there is nothing in the workflow to do that, I would expect to see triggers on the BIIO tables, but there aren't any.

    Hi,
Data is loaded from the BIIO tables into Sales_Data/Mdp_matrix using a workflow 'Transfer Step' in Demantra.
The transfer step takes the import integration interface as input and loads data from the corresponding BIIO tables into the Demantra base tables.
Please check whether there is a transfer step in the workflow after the ep_load_main run.
    Thanks,
    Rohit

  • Data load from variable file names

I have multiple files that I want to load into a cube, each starting with the same 5 characters but ending differently, e.g. GM1010104, GM1010204. What's the best option for a MaxL script to automate this data load? Can you use a wildcard name in the script to pick up anything starting with GM101****?

No - you need to specify the file name as it appears properly (I've never tried it, but I am pretty sure it wouldn't work). One solution to this problem, though, is to have a shell script (or DOS commands) auto-generate an ESSCMD/MaxL script based on the files that exist in a directory. Most scripting environments should allow you to loop through a list of files that match some pattern - you can then create a script with the results and execute it. Another option is to build a MaxL script that accepts a parameter (file name) and have a shell script call it as it loops through the file list. Hope that helps.
Regards,
Jade
Jade Cole
Senior BI Consultant
Clarity
[email protected]
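Jade's first suggestion - generate the load script from the files that actually exist - might look roughly like this in Python. The application/database/rules-file names and the exact MaxL statement layout are placeholders to adapt, not taken from the thread.

```python
import glob
import os
import tempfile

def generate_maxl(data_dir, pattern, app="Sample", db="Basic",
                  rules_file="GMrule", out_script="loadall.msh"):
    """Write one MaxL 'import database' statement per matching data file.
    The app/db/rules-file names here are placeholders, not real objects."""
    files = sorted(glob.glob(os.path.join(data_dir, pattern)))
    stmts = [
        f"import database {app}.{db} data from data_file '{f}' "
        f"using server rules_file '{rules_file}' on error append to 'load.err';"
        for f in files
    ]
    with open(out_script, "w") as f:
        f.write("\n".join(stmts) + "\n")
    return files

# Demo: create dummy GM101* files, then generate the load script from them.
d = tempfile.mkdtemp()
for name in ("GM1010104", "GM1010204", "OTHER0104"):
    open(os.path.join(d, name), "w").close()
script = os.path.join(d, "loadall.msh")
matched = generate_maxl(d, "GM101*", out_script=script)
print([os.path.basename(f) for f in matched])
```

The generated script can then be executed once with the MaxL shell, so each nightly run picks up whatever GM101* files happen to exist.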

  • Data load fron flat file through data synchronization

    Hi,
Can anyone please help me out with a problem? I am doing a data load into my Planning application through a flat file, using data synchronization for it. How can I specify, during my data synchronization mapping, that values from the load file at the same intersection should be added instead of overriding each other?
For example, the load file has the following data:
Entity Period Year Version Scenario Account Value
HO_ATR Jan FY09 Current CurrentScenario PAT 1000
HO_ATR Jan FY09 Current CurrentScenario PAT 2000
The value at the intersection HO_ATR->Jan->FY09->Current->CurrentScenario->PAT should be 3000.
Is there any possibility? I don't want to give users rights to the Admin Console for loading data.
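One possible workaround, independent of the data synchronization settings, is to pre-aggregate the flat file before loading, so duplicate intersections are summed up front. A minimal sketch (the column layout matches the example above; the last column is assumed to be the value):

```python
from collections import defaultdict

# Rows from the load file: the last column is the value, the rest
# form the intersection (Entity Period Year Version Scenario Account).
lines = [
    "HO_ATR Jan FY09 Current CurrentScenario PAT 1000",
    "HO_ATR Jan FY09 Current CurrentScenario PAT 2000",
]
totals = defaultdict(float)
for line in lines:
    *intersection, value = line.split()
    totals[tuple(intersection)] += float(value)

# The two PAT rows collapse into a single row with the summed value.
for key, total in totals.items():
    print(" ".join(key), total)
```

The aggregated file can then be loaded normally, and overwrite-vs-add no longer matters.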

    Hi Manmit,
First, let us know whether you are on BW 3.5 or 7.0.
In either case, just try including the fields X, Y, Date, Qty etc. in the DataSource with their respective length specifications.
While loading the data using the InfoPackage, just set the file format to Fixed length in your InfoPackage.
This will populate the values into the respective fields.
    Prathish
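Reading a fixed-length record, as described above, amounts to slicing each line at known offsets. A rough Python sketch - the field names (X, Y, Date, Qty) come from the reply, but the widths and the sample record are hypothetical:

```python
# Hypothetical fixed-length layout: widths are illustrative only,
# not an actual DataSource definition.
FIELDS = [("X", 4), ("Y", 4), ("DATE", 8), ("QTY", 6)]

def parse_fixed(record):
    """Slice one fixed-length record into named, stripped fields."""
    out, pos = {}, 0
    for name, length in FIELDS:
        out[name] = record[pos:pos + length].strip()
        pos += length
    return out

# 4 + 4 + 8 + 6 = 22 characters per record.
row = parse_fixed("AB  CD  20090129000123")
print(row)
```

This is exactly what the "Fixed length" setting in the InfoPackage does for you: each field is cut out of the record by its declared length, which is why the lengths in the DataSource must match the file.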

  • Data Loading via Infopakge Vs UCBATCH01

    Hello Experts,
I am new to BCS. I need to load data into the BCS cube. Currently we have a process chain that loads data into BCS (data stream load).
In that process chain, it seems the data is loaded using program UCBATCH01.
Sometimes users need to load data manually (like some adjustment entries). We are thinking of automating this process, so I have created a process chain for it, which picks the data file from the application server to load into BCS.
In this process chain I am thinking of using an InfoPackage, which can load data into BCS using a file from the application server.
I am unable to understand whether I should use the UCBATCH01 program or the above InfoPackage.
Please advise.
Thank you,
Murtuza.


  • Flat file data load (ASC II file format)

Hi,
We are currently on BI 7 and trying to load data from a flat file (with a fixed data record length). As it is a fixed-length data load, we end up with the following error for the quantity field.
The flat file provides the quantity field with length 8, and the quantity field is declared with data type "quantity field, points to a unit field with format UN" in the BI 7 system. The flat file contains only four fields; the second field is the quantity, and the 3rd and 4th are char. If I put length 17, decimals 3 and external length 23 at the DataSource level, then while loading it takes the 3rd (char) field into the quantity, and the values are not populated properly in each field.
If I put length 7, decimals 3 and external length 8 at the DataSource level, I get the data properly - but it is a quantity; can we declare it as 7?
If I put length 17, decimals 3 and external length 8 at the DataSource level, I cannot get the data properly for each field.
Thank you very much for your time.

Hi Vikram Srivastava,
Thank you very much for your time; I'm awarding points. So even if we declare it as 7, and even though it is a quantity field, there is no problem?

  • Hiearchy data loading from flat file to sap bw

    Hi experts,
I am new to this; can you please help me in solving this scenario?
I have a scenario like the following; can you tell me the procedure and the hierarchy flat file structure?
    To develop a data model in SAP BW to analyze sales
    MASTER DATA STRONG ENTITIES
Create characteristics for the following strong master data entities:
    Customer
    Outlet
    Sales Office
    Sales Region
    Sales Representative
    Material
    Use Calendar Day and Calendar Month as time characteristics
MASTER DATA WEAK ENTITIES
Create attributes for the following weak master data entities:
    Customer Name
    Customer Location
    Material Name
    Material Group
ADDITIONAL MASTER DATA (HIERARCHY)
    Create a hierarchy where sales offices are assigned to sales regions
    and sales representatives are assigned to sales offices
KEY FIGURES
    Quantity
    Price
    Tax %
    Sales Revenue
DATA LOADING STRATEGY
    Load all master and transaction data using flat files.
Thank you in advance
    Edited by: subbaraju on Dec 23, 2009 6:42 PM

Hi Arun,
Can you send me, in detail, the procedure for how to solve the above scenario?
That is, how many flat files do we need to create (customer flat file, material flat file, hierarchy flat file, transaction flat file)? I don't know whether that is right or not.
For tax and sales revenue, what formulas do we need to submit?
And which one do we need to take as the master data key for the hierarchy?
Thanks in advance
    Edited by: subbaraju on Dec 24, 2009 7:05 PM
