REFERENTIAL INTEGRITY IN BW Cubes.

Hi,
I have a question regarding referential integrity between Transaction data and master data.
I have created a BasicCube structure with one of the dimensions as Customer. I am loading Customer master data and transaction data from external flat files. In the transaction data file there is a record with customer number 100, but the master data file does not contain a record for customer number 100. How and at what level will the system check for this sort of referential integrity? Will it give me an error during loading of master data or transaction data? And what is the general procedure for loading master data and transactional data: do we need to load master data first, or transactional data?
Thanks.
SAU.

Hi and welcome to the SDN,
please do a search for your issue on the forums; we have already had that discussion a few times.
In general, it is always suggested to load master data first, both for good performance when loading the transactional data and especially in case you want to derive some other data depending on master data. While loading transactional data you can make a setting in the InfoPackage to have the master data checked. If you load the data without the validity check, the system will create a basic entry (in your case customer 100) and assign an SID, so that you will be able to load the master data later on (only the P table will be updated in that case). If you load it with the validity check, the system will raise an error while loading the data and you will have to load the master data first.
Hope this helps!
regards
Siggi
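The behaviour Siggi describes can be sketched in plain code. This is a minimal Python illustration of the idea only, not actual BW code; all names (the dict standing in for the P table, the `load_transaction` function) are invented for the example:

```python
# Minimal sketch of the master-data check during a transactional load.
# All names here are illustrative, not real SAP BW objects.

master_data = {}        # customer number -> SID (stand-in for the P table)
next_sid = [1]

def load_transaction(customer, check_master_data):
    """Mimics loading one transaction record into the cube."""
    if customer not in master_data:
        if check_master_data:
            # With the validity check: the load fails immediately.
            raise ValueError(f"Customer {customer} has no master data record")
        # Without the check: a stub master data entry is created so the
        # record gets an SID; the real attributes can be loaded later.
        master_data[customer] = next_sid[0]
        next_sid[0] += 1
    return master_data[customer]

# Without the check, customer 100 gets a stub entry and an SID:
sid = load_transaction(100, check_master_data=False)

# With the check, an unknown customer raises an error:
try:
    load_transaction(200, check_master_data=True)
except ValueError as e:
    print(e)
```

Either way, loading master data first avoids both the stub entries and the load failures.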

Similar Messages

• Referential Integrity in BW 7.0

    Hi,
In BW 3.5 we have the option to check the referential integrity in the communication structure.
Do we have that kind of option in BI 7.0? If so, could you please tell me where it is?
Points will be assigned.
    thanks,
    Subha

You can find it in transformations, in the 4th column.

• Referential integrity between 2 Oracle DBs

I have to create a foreign key constraint with a table in a different Oracle DB.
I tried creating a synonym through a DB link, but mentioning that synonym as the reference throws an error:
ORA-02444: Cannot resolve referenced object in referential constraints.
The same thing works if the referenced tables reside in different schemas on the same DB, instead of on a different DB server.
Can anyone help me out with a possible solution for achieving referential integrity between tables across two different DBs?

Thanks a lot for pointing that out; here I would like to make some comments, and please correct me if I am wrong.
I have read Kamal's comment that a "trigger cannot see unsaved changes". It is the same with a constraint, but there a deadlock can occur, while with a trigger a deadlock will not occur.
We have to be cautious about employing this design, as APC said. I agree with APC's comments regarding performance: any network problem can slow down your application, as well as break it.
What I suggested to the OP was to create two triggers: one at the master site to prevent deletion of a parent that still has children, and another at the child site to check for the parent's existence.
As Anwar suggested, a replica of the master site at the child site, with the FK then implemented at the child site (and likewise for preventing deletion the other way around), would also work; agreed.
Apart from the design question, I have also confronted the same scenario, which led me into some complexity but compelled me to implement triggers (though at a certain level they take extra attention).
So the only hope is to redesign, but is there any alternative as opposed to redesigning?
    Khurram
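As a language-neutral illustration (deliberately not Oracle trigger code), the two-trigger design discussed above amounts to two checks, one on each side. All names below are invented for the sketch; each "site" is just a dict standing in for a table in a separate database:

```python
# Sketch of the cross-database two-trigger design.
# In reality these would be tables in two different Oracle databases
# reached over a DB link; here each site is modelled as a plain dict.

parents = {1: "master row"}          # master site
children = {}                        # child site: child id -> parent id

def insert_child(child_id, parent_id):
    """Child-site 'trigger': reject a child whose parent does not exist."""
    if parent_id not in parents:
        raise ValueError(f"parent {parent_id} not found at master site")
    children[child_id] = parent_id

def delete_parent(parent_id):
    """Master-site 'trigger': reject deletion while children reference it."""
    if parent_id in children.values():
        raise ValueError(f"parent {parent_id} still has children")
    del parents[parent_id]

insert_child(10, 1)          # allowed: parent 1 exists
try:
    delete_parent(1)         # blocked: child 10 still references parent 1
except ValueError as e:
    blocked = str(e)
```

As the thread points out, this only approximates a real foreign key: the two checks cannot see each other's uncommitted changes, so a race between the sites can still break integrity.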

• Schema Separation vs. Referential Integrity

    Newbie Question...
I am developing a data model, and someone mentioned that when doing the data model for our specific app I should group related objects in their own schemas. For example, item data and related information like warranties, inventories, etc., should all be in one schema, and objects like orders, sales, returns, etc., should all be in another schema; keep relationships within each schema, but do not cross relationships over schemas. For example, an item is on an order, so there should be some sort of integrity: the app should look up the item number to see if it is valid, and then store the record in the order object through code (PL/SQL also). I hope this makes sense :)
When running a query, you would just join the tables in the WHERE clause to mimic referential integrity. Is this a good practice when developing your database? Is this common? What are some good ways of doing this? Any recommendations? I have about 75 tables in my model and relationships are all over the place, and the person who QA'ed my model told me to do it his way (mentioned above). What should I do???
    Please help

Well, what is happening here is that at some point we will grow our DB to maybe well over 100 tables, or downsize to maybe 30 tables. We are currently bringing Oracle solutions in here, but the business still needs customized apps (being us). Years and years down the road, Oracle may bring in more off-the-shelf products that might make us obsolete, in which case we will have to migrate our data to them and drop our tables. I see what you are saying about keeping all the tables in one schema, and even about separating the tables into different schemas while keeping the integrity. Our plan was to keep all lookup tables in a separate schema, order tracking details in another schema, etc. This seems like it would be easy to manage. Is this a good way of doing it, or should I keep all the tables in one schema? This is my first time doing a project this big... I just want to do it the best way possible... and the right way!
    Please help!

  • Migrating MySQL5 database to Oracle 10g - refrential integrity constraints

On migrating a MySQL 5 database to Oracle, the referential integrity constraints are not migrated to Oracle. The capture stage shows it has captured the constraints, but the constraints are missing from the Oracle Model. Is this a bug? If not, what do I need to do to get the constraints migrated?

    Check out *URGENT* Does Oracle SQL Developer able to migrate tables relationships?
    K.

• Referential Integrity Plugin in iPlanet 5.1

    Hi,
I was trying to test this feature, but it seems not to be working. When I delete a DN entry, the entries with an attribute pointing to this DN are not getting updated.
I see an error message in the errors log file, but the referint log file is not getting created. Is there anything I am missing in the configuration?
It's a single-server/simple scenario.
    Thanks.
    Avijeet

I am trying to use the regular ldapsearch command from the command line, and not any C/Java code, like:
ldapsearch -D "cn=directory manager" -p 1389 -h localhost -w abcd1234 -r -C PS:any:1:0 -b dc=abc,dc=com ou=emp
Despite using the -r and -C options, it just exits after displaying the entries below:
ou=emp,dc=abc,dc=com
ou=emp
objectClass=top
objectClass=organizationalUnit
I also tried the command below:
ldapsearch -D "cn=directory manager" -p 1389 -h localhost -w abcd1234 -r -C PS:any:1:0 -b dc=techm,dc=com objectclass=person
It displays a long list and then just exits; I am not sure why it is not doing the persistent search despite the -r and -C options.
Could you give me some ideas on this?

• What is meant by Referential Integrity? Where do we use it and why?

    Hi All,
Can anybody tell me, what is meant by Referential Integrity? Where do we use it and why?
    Regards,
    Kiran Telkar

    Dear Kiran Telkar ,
you might know that, generally, referential integrity is concerned with nothing but the primary key and foreign key relationship. Generally we use it to check the uniqueness of records.
In SAP we use it during flexible update, to check the data records of transaction data against master data.
In other words, it checks before loading whether the data will load properly or not.
We check (tick) the option in the maintenance of the
<b>InfoSource --> communication structure</b>
It would be better if you clearly describe your problem, if further help is needed.
Hope this will help you.
    Regards
    vinay
    <i>please assign points to all who will help you.</i>
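The primary key / foreign key relationship described above is exactly what a relational database can enforce on its own. As a small, self-contained illustration (using SQLite via Python's standard library purely as an example; the table and column names are invented and nothing here is SAP-specific):

```python
import sqlite3

# In-memory database; foreign key enforcement must be switched on in SQLite.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")
con.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY)")
con.execute("""CREATE TABLE sales (
                   sale_id     INTEGER PRIMARY KEY,
                   customer_id INTEGER REFERENCES customer(customer_id),
                   amount      REAL)""")

con.execute("INSERT INTO customer VALUES (100)")
con.execute("INSERT INTO sales VALUES (1, 100, 99.90)")   # parent exists: OK

try:
    # Customer 200 has no master record, so this insert violates
    # referential integrity and the database rejects it.
    con.execute("INSERT INTO sales VALUES (2, 200, 10.00)")
except sqlite3.IntegrityError as e:
    rejected = str(e)
```

The same principle is what BW's master-data check applies at load time: transaction records must refer to existing master data keys.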

• What is referential integrity

    Hi all,
What is the referential integrity option found in the communication structure? What is its significance, and when do you use it in a real-time scenario? Please explain with a real-time scenario.
Also, what is cardinality, and when should you use it? Cardinality can only be defined in InfoCubes, isn't it? Can cardinality be defined in an ODS?
    thanxs in advance
    hari

    Hi Hari,
  Referential integrity is a database concept which logically combines the data of multiple tables with a master table. The master table has the primary key, and the rest of the tables carry the same key as a foreign key.
Here we use the characteristic (or combination of characteristics) called the referential characteristic.
Check this:
you might know that, generally, referential integrity is concerned with nothing but the primary key and foreign key relationship. Generally we use it to check the uniqueness of records.
In SAP we use it during flexible update, to check the data records of transaction data against master data.
In other words, it checks before loading whether the data will load properly or not.
We check (tick) the option in the maintenance of the
InfoSource --> communication structure
It would be better if you clearly describe your problem, if further help is needed.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/3a/14c43bb7137503e10000000a11402f/content.htm
Referential integrity is the property that guarantees that values in one column depend on values in another column. This property is enforced through integrity constraints.
    see this link...
    http://help.sap.com/saphelp_nw2004s/helpdata/en/6d/117c5fd14811d2a97400a0c9449261/content.htm
    Please find the below url
    https://www.sdn.sap.com/irj/sdn/advancedsearch?query=referential%20integrity&cat=sdn_all
    thanks
    @jay

  • Transformation Integrity

    Hi,
I have a problem. I have a transformation rule with several InfoObjects, most of them with the integrity check activated. I want to remove the integrity check from one of them, but after saving and activating, that check is still there.
    Thanks in advance.

    Hi,
I don't think it is possible to keep the referential integrity check on all the InfoObjects except one. The only case where I think it is possible is if you change the InfoObject from master data to a normal InfoObject, i.e. unmark the master data check.
This is because when SAP checks the referential integrity just before writing data to the target, there is no step available where you can skip a particular value.
But the reverse scenario is possible, i.e. removing the integrity check for all the InfoObjects while keeping it for only one.
    Regards,
    Durgesh.

  • BO XI 3.1 + Essbase hyperion 9.3

I'm having a little problem when I create a Web Intelligence report based on a universe integrating Hyperion cubes: when I view my document, only the dimension values are displayed normally, but when I try to run the query with metric objects, the result comes back empty.
    I'm using BO XI 3.1 SP2
    If anyone has encountered this problem, I accept suggestions and assistance on how to solve this problem.
    Thanks!

    If Essbase works like any other OLAP processing engine, like BW...
    Then, whenever you select no measures you are querying over the dimension tables, not the cube.
Worse, selecting more than one dimension will result in a cartesian product between the two.
So only when you select a measure will it look at the actual contents of the cube.
    So I would have a look at the raw cube contents and try to see if there is some condition preventing you from seeing this data.
    Hope this helps,
    Marianne
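The cartesian product Marianne mentions is easy to see in miniature. Here is a hedged sketch (plain Python, no actual OLAP engine; the dimension and fact data are invented) of what happens when two dimensions are combined without a measure to constrain them:

```python
from itertools import product

# Two small dimension tables; nothing here says which combinations
# actually exist in the cube.
regions = ["North", "South"]
products = ["Widget", "Gadget", "Gizmo"]

# Selecting only dimensions yields every combination: 2 x 3 = 6 rows,
# regardless of what the cube really contains.
combos = list(product(regions, products))

# Only a fact/measure table pins the result down to real cube contents.
facts = {("North", "Widget"): 42.0, ("South", "Gizmo"): 7.5}
real_rows = [(r, p, facts[(r, p)]) for r, p in combos if (r, p) in facts]
```

With a measure in the query, only the two combinations that exist in the facts survive; without one, all six dimension combinations come back.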

  • Getting error "1013009 Administrator Has Temporarily Disabled User Commands

    Hi All,
I am getting the error "1013009 Administrator Has Temporarily Disabled User Commands" while executing a report script in Essbase 11.1.1.3.
    Appreciate any help..
    Thanks
    Mahesh

    Mahesh wrote:
    Hi All,
I am getting the error "1013009 Administrator Has Temporarily Disabled User Commands" while executing a report script in Essbase 11.1.1.3.
    Appreciate any help..
    Thanks
    Mahesh
    Possible Cause
    When a database is being restructured or any application/database on the server is being copied, you can get this message.
    or
    When a cube is being restructured, commands are restricted because the integrity of the cube has to be stable and no one is allowed to access it.
    or
    Copying an application requires that the Essbase security file be in read/write mode and therefore other applications are not accessible until the process is completed.
Possible Solution
In Application Settings, verify that the Allow Commands and Allow Updates options are selected.
If they are not selected, select them and try again.
    Regards,
    Prabhas

  • Best way to aggregate large data

    Hi,
    We load actual numbers and run aggregation monthly.
The data file grew from 400k lines to 1.4 million lines. The aggregation time grew proportionally and now takes 9 hours, and it will continue to grow.
    We are looking for a better way to aggregate data.
    Can you please help in improving performance significantly?
    Any possible solution will help: ASO cube and partitions, different script of aggregation, be creative.
    Thank you and best regards,
Some information on our environment and process:
    We aggregate using CALC DIM(dim1,dim2,...,dimN).
    Windows server 64bit
We are moving from 11.1.2.1 to 11.1.2.2
    Block size: 70,000 B
Dimensions (type, members, sparse members; the aggregated dimensions were marked in bold and underlined in the original post):

Dimension      Type    Members  Sparse Members
Account        Dense      2523             676
Period         Dense        19              13
View           Dense         3               1
PnL view       Sparse       79              10
Currency       Sparse       16              14
Site           Sparse       31              31
Company        Sparse      271              78
ICP            Sparse      167             118
Cost center    Sparse      161             161
Product line   Sparse      250             250
Sale channels  Sparse      284             259
Scenario       Sparse       10              10
Version        Sparse       32              30
Year           Sparse        6               6

    Yes I have implemented ASO. Not in relation to Planning data though. It has always been in relation to larger actual reporting requirements. In the new releases of Planning they are moving towards having integrated ASO reporting cubes so that where the planning application has large volumes of data you can push data to an ASO cube to save on aggregation times. For me the problem with this is that in all my historical Planning applications there has always been a need to aggregate data as part of the calculation process, so the aggregations were always required within Planning so having an ASO cube would not have really taken any time away.
    So really the answer is yes you can go down the ASO route. But having data aggregating in an ASO application would need to fit your functional requirements. So the biggest one would be, can you do without aggregated data within your planning application? Also, its worth pointing out that even though you don't have to aggregate in an ASO application, it is still recommended to run aggregations on the base level data. Otherwise your users will start complaining about poor performing reports. They can be quite slow, and if you have many users then this will only be worse. Aggregations in ASO are different though. You run aggregations in a number of different ways, but the end goal is to have run aggregations that cover the most commonly run reporting combinations. So not aggregating everything and therefore quicker to run. But more data will result in more time to run an aggregation.
    In your post you mentioned that your actuals have grown and the aggregations have grown with it, and will continue to grow. I don't know anything about your application, but is there a need to keep all of your actuals loading and aggregating each month? Why don't you just load the current years actuals (Or the periods of actuals that are changing) each month and only aggregate those? Are all of your actuals really changing all the time and therefore requiring you to aggregate all of the data each time? Normally I would only load the required actuals to support the planning and forecasting exercise. Any previous years data (Actuals, old fcsts, budgets etc) I would archive and keep an aggregated static copy of the application.
Also, you mentioned that you had CALC PARALLEL set to 3 and then moved to 7. But did you have TASK DIMS set at all? The reason I ask is that if you didn't, your CALC PARALLEL would likely give you no improvement at all. If you don't set it to the optimal value, then by default it will try to parallelize using the last dimension (in your case Year), so it is not really breaking up the calc (this is a very common mistake when CALC PARALLEL is used). Setting this value in older versions of Essbase is a bit of trial and error, but the rule of thumb is that it should be set to at least the last sparse aggregating dimension to get any value. So in your case the minimum value should be TASK DIM 4, but it's worth trying higher, up to 6. Try 4, then 5, and then 6. As I say, trial and error. But I will say one thing: by getting your CALC PARALLEL setting right you will save much more than 10% on aggregations. You say you are moving to 11.1.2.2, so I assume you haven't run this aggregation on that environment yet? If so, the TASK DIM setting is not required in that environment; Essbase will calculate the best value for you, so you only need to set CALC PARALLEL.
    Is it possible for you to post your script? Also I noticed in your original email that for Company and ICP your member numbers on the right are significantly smaller than the left numbers, why is this? Do you have dynamic members in those dimensions?
I will say that 6 aggregating dimensions is always challenging, but 9 hours does sound a little long to simply aggregate actuals, even for the 1.4 million records.
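The incremental idea in the reply (load and aggregate only the periods that actually changed, instead of reprocessing the full history every month) can be sketched outside of Essbase. A minimal Python illustration, with invented data structures, of why restricting the aggregation scope shrinks the work:

```python
# Sketch: aggregate only the slices that changed, not the whole history.
# The "cube" is a dict keyed by (year, period); values are record lists.
cube = {(2023, p): [1.0] * 1000 for p in range(1, 13)}   # static history
cube.update({(2024, p): [2.0] * 1000 for p in range(1, 7)})

aggregates = {}

def aggregate(keys):
    """Recompute totals for the given (year, period) slices only."""
    for key in keys:
        aggregates[key] = sum(cube[key])
    return len(keys)

# A full aggregation touches every slice:
full = aggregate(list(cube.keys()))     # 18 slices

# A monthly incremental run touches only the changed period:
incremental = aggregate([(2024, 6)])    # 1 slice
```

In Essbase terms this corresponds to fixing the calc on the current year/periods (or using an archived static copy for prior years), so the monthly CALC DIM only covers the data that moved.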

  • Enabling Line-item Details

    HI Folks
I am trying to test the feature called Line Item Details. At the moment all accounts have this set to No. I tried to change this on an existing account to Y and load, but I got an error message about referential integrity. I then tried to create a new account and uploaded this. It loaded, but when I went to a data grid and right-clicked there was no option for line items, so I thought I might be missing something.
I looked in my admin guide, and there is another section in the metadata, on the scenario, which has the same attribute, so I changed this to Y as well, but again when I try to reload I get the referential integrity error.
Has anyone tried to set this up before, particularly on existing accounts? Does the scenario also need this enabled, or does that attribute refer to something else? I haven't used this feature before, but it looks like it might be useful. Any help/advice is most appreciated.
    Thanks
    Tahir

The referential integrity check comes because you may have some data in the existing account as well, since that account did not support Line Item Details previously. So what you need to do while loading the metadata is check the option to clear all metadata before loading. This might help you.
And yes, this also needs to be enabled for the scenario in the metadata.
Hope this is helpful
    Varun

  • Write Back from OBIEE 11g to multiple tables in DB

    Hi All,
We have a requirement to write back from OBIEE 11g to multiple tables in the DB.
1) Inserting a new row in a report: can a record be inserted into multiple tables through OBIEE?
2) In the report we have fields like Region Id, Product Id, Date, and Quantity. Region Id, Product Id, and Date are dimension fields and Quantity is a fact measure. While inserting a row in the report, does OBIEE check referential integrity constraints during write-back to the DB?
Let me know if you need more details on this. Thanks in advance.
    Regards,
    Rajkumar.

    Hi,
1) With regard to inserts into multiple tables, try using multiple insert tags. I haven't tried it, but I think it should work.
2) I don't think OBIEE would check referential integrity, but the DB would, and it would give you the appropriate response.
    Let us know how you got on...
    Thanks & Regards

• Extract tables from "Oracle Apps 11.0.3" and load into "Oracle Apps 11i"

    Hi hussein,
I am tasked to extract the following tables from Oracle Apps 11.0.3 and load them into Oracle Apps 11.5.10.2.
    PO_VENDORS
    PO_VENDOR_SITES_ALL
    Can I use export / import?
    Thanks a lot

    Hi,
I believe export/import of individual apps tables is not supported, due to object dependencies and referential integrity.
    Regards,
    Hussein
