Double data records in the cube.
Hi experts,
When we load data from a DSO to an InfoCube using a DTP, it updates duplicate records in the InfoCube: the same record appears twice, with the key figure values zero.
The transformation has no start routine or end routine.
Could anyone please tell me how to overcome this?
Thanks,
Kumar
Hi Fredrik,
Not all settings in InfoPackages work the same way in process chains as they do when you run the package manually. You can usually check this by pressing F1 on the setting. In your case, you need to add a process type for deleting the data to the chain. In the chain maintenance, look at the process types under load processes; there you will find the type you need.
Kind regards,
Siggi
Similar Messages
-
Identifying duplicate master data records using the MDM Import Manager
Hi all,
I read the topic "How to identify duplicate master data records using the MDM Import Manager".
I tried to create import maps and set rules, but when I import, it creates a new vendor record for each rule with the rest of the fields blank.
When I import vendor data, all three fields (Match Rate, Match Type, and Match Group) are blank.
My question is:
I am getting vendor data from SAP R/3.
In which source (the lookup XML file or the data XML file) do I have to include these three fields, and how will all the rules be reflected in the repository?

Hi Sheetal,
Here is what to do when you import any data (vendor master); please follow these steps:
1. First of all, apply the map to the source data.
2. In the Match Records tab there are 3 possibilities:
   a. [Remote Key]: checks the current source record against the repository along with all the fields - this is the default.
   b. Remove [Remote Key] by double-clicking it, and choose any single field such as Vendor Number or Name - the current record will then be matched against the repository based on that field.
   c. Instead of a single field, you can also choose a combination of fields.
3. Based on the match results, the match class is set automatically:
   a. None
   b. Single
   c. Multiple
4. Then the Match Type:
   a. Exact - all the individual value matches are Equal.
   b. Partial - at least one value match is Equal and at least one is Undefined; no value matches are Not Equal.
   c. Conflict - at least one value match is Equal and at least one value match is Not Equal.
5. Then check the import status and execute the import.
Hope this helps you.
Cheers,
Alexander
Note: Please don't forget to reward points. -
Delete blank records from the cube
Hello;
We just upgraded from 3.5 to 7.0. When we started running our queries for the Inventory cube, we were getting an error message of "Error: The validity interval has the initial value as lower limit".
When I checked the cube, I found records that were blank: no dates, plants, etc. I then tried to see if I could delete those records, but I cannot find a way to do this. Can someone tell me how to delete the blank records, or any records, from the cube directly? When I run the queries from the ODS, I do not get any errors.
Any information, suggestions or HELP is greatly appreciated.
Thanks,
Maximina Barry

Hello Maximina,
You could try writing some code in the start routine of the transformation that connects the ODS to the cube, to delete the records that have blank fields, so that they are filtered out the next time you load data from the ODS to the cube.
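For example, a minimal start-routine sketch (PLANT and CALDAY are placeholder field names; use whichever fields are blank in your bad records):

```abap
* Start routine of the ODS -> cube transformation:
* drop records whose key fields are all initial, so they
* never reach the cube again.
* (PLANT and CALDAY are placeholder field names.)
DELETE SOURCE_PACKAGE WHERE plant  IS INITIAL
                        AND calday IS INITIAL.
```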
Hope it helps.
Just to add ...
refer -> Need Sample Code to delete records from Cube using SE38 Editor
Thanks,
~Vj
Message was edited by:
Vijay Gopinath -
Customized delta data source for deleting data record in the source system.
Hello Gurus,
There is a customized delta DataSource; how do I implement the delta function for deleting data records in the source system?
I mean, if a record is deleted in the source system, how do I notify the SAP BW system of this deletion through this customized delta DataSource?
Many thanks.

Hi,
Whenever a record is deleted, you need code that inserts the record into a Z table; load these records into BW, into a cube with a similar structure, and while loading into this cube multiply the key figure by -1.
Add this cube to the MultiProvider. The union of the records in the original cube and the cube holding the deleted records will result in a zero value, which will not be displayed in the report.
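The reversal itself can be a one-line key figure routine in the transformation into the "deleted records" cube; a sketch (AMOUNT is a placeholder key figure name):

```abap
* Key figure routine: negate the incoming value so the
* MultiProvider union cancels the original record to zero.
* (AMOUNT is a placeholder field name.)
RESULT = SOURCE_FIELDS-amount * -1.
```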
Regards, -
End Routine - Modify a record in the cube
Hello Guys,
In the end routine I have to update a field. The transformation goes from the cube to the same cube. I want the record to be modified in the end routine, i.e. a value added to a field. Since it is a cube, it creates a new record instead of overwriting the existing one. Is there any way I can modify the record? I know I can set the existing record to 0.00 and then create a new record with the new value; is there any other solution?
For example, this is the existing record in the cube:

Sales Order | Item No | Backlog Amount | Indicator
1000        | 10      | 1000.00        |

After applying the end routine, there are 2 records (I modify the record with the indicator value):

Sales Order | Item No | Backlog Amount | Indicator
1000        | 10      | 1000.00        |
1000        | 10      | 1000.00        | REV

After applying the end routine, I need the record to be overwritten (similar to a DSO):

Sales Order | Item No | Backlog Amount | Indicator
1000        | 10      | 1000.00        | REV

How do I achieve this result in the end routine?
Thanks
Senthil

Hi there,
Since you create new records in the end routine in the InfoCube, why not delete the old ones?
You can use
delete RESULT_PACKAGE where ...
thereby deleting the old records after inserting the new ones.
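Putting both steps together, a sketch of such an end routine (INDICATOR and the value 'REV' come from the example above; _ty_s_TG_1 is the generated result structure type of the transformation):

```abap
* End routine: flag each unflagged record with 'REV' in a
* new copy, then delete the old unflagged originals.
DATA ls_result TYPE _ty_s_tg_1.

LOOP AT RESULT_PACKAGE INTO ls_result
     WHERE indicator IS INITIAL.
  ls_result-indicator = 'REV'.
  APPEND ls_result TO RESULT_PACKAGE.
ENDLOOP.

DELETE RESULT_PACKAGE WHERE indicator IS INITIAL.
```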
Diogo. -
Report does not show data, but data exists in the cube.
Hi All,
I have a situation where I cannot show the data in the report. When I load data from extractor 0CO_OM_WBS_1 into a cube directly, I am able to show the data in my report. When I load the same extractor to a DSO, and from the DSO into the cube, the data does not show up in the query. To check the data I used the same restrictions and could see that the data resides in the cube (LISTCUBE). I compressed the requests; it still does not show up in the query. No aggregates are created on the cube.
It shows the data if I load directly from the extractor, but not when I load the data through the DSO.
Any ideas.
Alex (Arthur Samson)

Hi Alex,
I am facing the same problem: I have data in the cube, but in the report I can't see it.
I created a generic DataSource and loaded the data to a DSO, then to the cube.
The data is loaded successfully and I can see it in the cube (Manage), but when I run the report I am not getting data.
I think you have solved this issue... please help me resolve it.
Regards,
SHAIK. -
Looking for a specific data in all the cubes and ods
Hi Gurus
"I am looking for all the cubes/ODSs that contain a specific controlling area (let's say 0123) and a specific 0PLANT (let's say plant 4567). I could go into every cube and ODS and search its contents, but I have hundreds of cubes and it would take days. Is there a simple way to look for particular data in all the cubes/ODSs that tells me which cube/ODS contains these plants and controlling areas?"
Now, based on the above post, I got a reply that ABAP can help:
"You could write an ABAP program where, for every InfoProvider, you call function RSDRI_INFOPROV_READ_RFC, like:
loop at <infoprov-table> assigning <wa>.
  call function 'RSDRI_INFOPROV_READ_RFC'
    exporting
      i_infoprov = <wa>
    tables
      i_t_sfc     = l_t_rsdri_t_sfc
      i_t_range   = l_t_rsdri_t_range
      e_t_rfcdata = l_t_rsdri_t_rfcdata
    exceptions
      illegal_input          = 1
      illegal_input_sfc      = 2
      illegal_input_sfk      = 3
      illegal_input_range    = 4
      illegal_input_tablesel = 5
      no_authorization       = 6
      generation_error       = 7
      illegal_download       = 8
      illegal_tablename      = 9
      illegal_resulttype     = 10
      x_message              = 11
      data_overflow          = 12
      others                 = 13.
endloop.
i_t_sfc should contain 0PLANT, and i_t_range the restriction on your plant value.
With a DESCRIBE TABLE statement on l_t_rsdri_t_rfcdata you can get the number of hits.
Check the test program RSDRI_INFOPROV_READ_DEMO for details.
best regards
Clemens"
Now my question is: how do I use this code to check each and every cube in BW? It seems to be meant for only one cube at a time. And what does he mean by "for every InfoProvider"?
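One way to make the quoted snippet cover every cube is to fill <infoprov-table> from the BW directory tables first; a sketch (RSDCUBE and RSDODSO are the standard catalog tables for InfoCubes and ODS objects, and a simple table of InfoProvider names is assumed):

```abap
* Collect the names of all active InfoCubes and ODS objects,
* then run the RSDRI_INFOPROV_READ_RFC loop over this table.
DATA lt_infoprov TYPE TABLE OF rsinfoprov.

SELECT infocube  FROM rsdcube
  INTO TABLE lt_infoprov
  WHERE objvers = 'A'.           " active cubes only

SELECT odsobject FROM rsdodso
  APPENDING TABLE lt_infoprov
  WHERE objvers = 'A'.           " active ODS objects
```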
Thanks
-
Revaluate data record at the time of loading from flat file or BI Cube
Hello Friends,
I want to revaluate a data record at load time using the transformation file or conversion file, based on some condition.
For example, I have a rule to identify whether a record is supposed to be multiplied by -1 or not.
For example,
*if (ID(1:5) = str(00070) then(Record-1)
ID(1:5) = str(00071) then (Record-2)
Can you please guide me how can I achieve this by using Transformation file or Conversion file?
Regards,
Vishal.

Hi Nilanjan,
Thanks for the reply.
I tried the script you suggested in the conversion file for Account, but it is not working for me. Even simple multiplication, and also addition, in the Formula column is not working.
External --> *
Internal --> *
Formula ---> Value * -1
The above conversion file for Account did not work for me.
Then I tried:
Formula --> Value + 100
That did not work for me either.
Kindly suggest if I am doing anything wrong in the above file.
Thanks,
Nilanjan. -
Duplicate Data Records indicator / handling duplicate records
Hi All,
I am getting duplicate data in two requests. How can I delete the extra data using the "Duplicate Data Records" indicator?
I am not able to see this option in the PSA or in the DTP (handling duplicate records).
Can you help me find this option in the PSA/DTP?
Regards
Amit Srivastava

What Arvind said is correct.
But you can try this out in an end routine; that may work (not sure, though), because there you deal with the entire RESULT_PACKAGE.
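A sketch of that end routine (DOC_NUMBER and ITEM are placeholders for your semantic key fields):

```abap
* End routine: sort by the semantic key and keep only one
* record per key.
* (DOC_NUMBER and ITEM are placeholder field names.)
SORT RESULT_PACKAGE BY doc_number item.
DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE
  COMPARING doc_number item.
```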
Also, if the target you are talking about is a DSO, then you can DELETE ADJACENT DUPLICATES in the start routine while updating it into your next target; that can be a cube, for example. -
hello,
I have created four dimensions, all validated, deployed and loaded successfully.
I created a cube and used a joiner to load the keys from the tables. It has just one measure, which I am loading from a different table.
I ran a query with a join condition (WHERE clause) in Oracle SQL Developer, and it returns the desired result.
So I used the same join condition in the cube.
The cube and the corresponding map are validated and deployed successfully, but when I start the map, it inserts no records into the corresponding table of the cube.
Please help me solve this problem.
best regards
P.S.: I am using OWB 11g on SUSE 10.2.

The loading process in a cube does a join with the dimensions:
"D_DIMENSION_X"."DIMENSION_CODE" = "INGRP1"."D_DIMENSION_CODE"
Thus, you see no error even if a dimension code is missing in the dimension table.
I had the problem with the time dimension (a missing year id, for instance), and it is very painful to find.
First, you have to retrieve the generated SQL; check this article on how to get the SQL (intermediate SQL generation):
http://gerardnico.com/wiki/dw/etl/owb/owb_mapping_debugger
Second, take the SQL and execute it, suppressing one dimension at a time, and see if you get data from your SQL,
and repeat the steps until you see the light.
Good luck
Nico -
How to differentiate the records in the cube
Hi experts,
I get data from 3 dso's to a cube through 3 DTP's.
In the cube, how do I recognize that a particular record came from a particular DSO?
I thought of identifying them by request IDs, but for delta loads different request IDs come in daily.
Can we populate the DTP name into the cube?
If so, how?
Regards,
Bhadri M.

Yes Bhadri,
this can be done by adding another characteristic to the cube and populating it with different values, say 'A' for DSO1, 'B' for DSO2, and similarly for the third.
This characteristic lets us identify the source of the records at a later point.
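In each of the three transformations you can then map this characteristic either as a constant or with a one-line field routine; a sketch (ZSOURCE stands for the new characteristic):

```abap
* Field routine for the source-flag characteristic (ZSOURCE)
* in the DSO1 -> cube transformation; use 'B' and 'C' in the
* transformations from DSO2 and DSO3.
RESULT = 'A'.
```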
Naveen.A -
Error while loading data onto the cube: Incompatible rule file.
Hi,
I am trying to load data into an Essbase cube from a data file. I already have a rule file on the cube, and I am getting the following error while loading the data. Is there a problem with the rule file?
SEVERE: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
com.essbase.api.base.EssException: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect.essMainBeginDataload(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin.essMainBeginDataload(Unknown Source)
at com.essbase.api.datasource.EssCube.beginDataload(Unknown Source)
at grid.BudgetDataLoad.main(BudgetDataLoad.java:85)
Error: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
Feb 7, 2012 3:13:37 PM com.hyperion.dsf.server.framework.BaseLogger writeException
SEVERE: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
com.essbase.api.base.EssException: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect.essMainLoadBufferTerm(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin.essMainLoadBufferTerm(Unknown Source)
at com.essbase.api.datasource.EssCube.loadBufferTerm(Unknown Source)
at grid.BudgetDataLoad.main(BudgetDataLoad.java:114)
Error: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
Thanks,
Santhosh

"Incompatible rule file. Duplicate member name rule file is used against unique name database."
I am just guessing here, as I have never used the duplicate-name functionality in Essbase, nor do I remember which versions it is in. With that said, though, I think your answer is in your error message.
Guessing again: it appears that your rule file is set to allow duplicate member names while your database is not. With that information in hand (given to you in the error message), that is where I would start to explore.
Position of the data record in the table control
Hi all,
I am working on a module pool that has a table control which fetches data from a transparent table.
Suppose there is data in the z-table and the table control is showing the records.
I want two text boxes beside the table control: one showing the index of the top row on the current page of the table control, and the other showing the index of the last row appearing on that page.
If I press page-down, I should get new values in both text boxes.
Please help me figure out which values to use to show the indexes.
thanks
Ekta

Thanks for your help.
I used a value 'n', which is the number of rows on one page of the table control, so when I page down it shows me the next values, i.e. the indexes of the rows currently on the table control's next page.
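For reference, the table control's own attributes already hold these numbers; a sketch (TC is the table control, LV_LINES the number of visible rows per page):

```abap
* PBO/PAI: derive the visible row range of table control TC
* (type CXTAB_CONTROL) for the two text boxes.
DATA: lv_top    TYPE i,
      lv_bottom TYPE i.

lv_top    = tc-top_line.                " first visible row
lv_bottom = tc-top_line + lv_lines - 1. " last visible row
IF lv_bottom > tc-lines.                " clamp to total rows
  lv_bottom = tc-lines.
ENDIF.
```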
Anyway, thanks a lot.
Data Load into the Cube based on Fiscal Year
Hi All,
I was told to load the data into the cube from 3 different DataSources, but based on fiscal year: load the data for 2010, 2009, and so on.
Any suggestions, please...

Hi Dear,
Write the following code in the start routine of the update rules.
In BI 3.x:
DELETE DATA_PACKAGE WHERE calday LT '20090101' OR calday GT '20091231'.
to load data only for 2009; do the same for 2010.
In BI 7.0, in the start routine of the transformation:
DELETE SOURCE_PACKAGE WHERE calday LT '20090101' OR calday GT '20091231'.
Regards
Obaid -
Can I get the data file from the cube?
Hi All,
Previously I created and activated one cube using one data file, and by mistake lost that data file. Is there any chance to get that data file back from the cube?
Thanks
Bhaskar

Hi Paul,
Yes, you can:
1) If you have loaded through the PSA: go to the PSA table and, from the Settings menu, choose Change Display Variants -> View tab -> View as Excel, then save this file.
2) If you have not used the PSA: use LISTCUBE to view the data, change the settings to Excel, and download.
Don't forget to assign points if this is useful.
Further queries are always welcome.
Regards,
Pradeep Choudhari