0RecordMode, ODS and Generic Data
I have created a generic DataSource in R/3 from a Y table.
I will not be doing a generic delta on it.
I replicated the DataSource to BW and activated the necessary transfer and communication structures.
Now I want to store all the data in an ODS, but when I try to create update rules for the ODS, it says 0RecordMode not found.
I don't want to use delta, and the help says I can do without it.
But I am still not able to create update rules.
Kindly help.
Pranab Sharma
Hi Pranab,
You can ignore the warning and create update rules for the ODS. Generally 0RECORDMODE is assigned to the ROCANCEL field; if you don't have that field in your InfoSource, the system issues this warning, which you need not consider.
regards,
Similar Messages
-
ODS in Generic Data source?
Dear Friends,
I need to do a generic extraction from a view (2 tables). This is a transactional extraction.
How can I decide whether I have to build an ODS or not? We know that from 3.x onwards, delta is enabled for generic DataSources. What other purpose is there to use an ODS with a generic extraction?
Thanks, Sudha
A simple example:
Suppose you have a document with 3 fields: document number, material and quantity.
If in R/3 you change the material of one document and load it into an InfoCube in BW, the InfoCube would have 2 records, one with the old material and one with the new material; the old record is not replaced by the new one.
In an ODS you use document number as the key field and the others as data fields. If you change the material and load again, the old record in the ODS is overwritten by the new one, and only the new one exists as active data in the ODS. Next you could load the data from the ODS to the InfoCube.
An ODS is useful if you load data at document level.
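The overwrite-versus-addition behaviour described above can be sketched with two internal tables. This is illustrative ABAP only, not BW-generated code, and all names are invented:

```abap
* Illustrative sketch: an ODS behaves like a table with a unique key
* (document number) whose rows are overwritten, while a cube behaves
* like a table to which every load simply appends.
TYPES: BEGIN OF ty_doc,
         docno    TYPE char10,
         material TYPE char18,
         quantity TYPE i,
       END OF ty_doc.

DATA: lt_ods  TYPE SORTED TABLE OF ty_doc WITH UNIQUE KEY docno,
      lt_cube TYPE STANDARD TABLE OF ty_doc,
      ls_doc  TYPE ty_doc.

* First load: document 4711 with material MAT-A
ls_doc-docno = '4711'. ls_doc-material = 'MAT-A'. ls_doc-quantity = 10.
INSERT ls_doc INTO TABLE lt_ods.
APPEND ls_doc TO lt_cube.

* The material is changed in R/3 and the document is loaded again
ls_doc-material = 'MAT-B'.
MODIFY TABLE lt_ods FROM ls_doc.  " ODS: old record overwritten, 1 row
APPEND ls_doc TO lt_cube.         " cube: both rows kept, 2 rows
```

After both loads, lt_ods holds a single row carrying MAT-B, while lt_cube holds both the MAT-A and the MAT-B rows; this is why the cube alone cannot replace changed documents.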
I hope this helps you.....
Regards -
Look up two ODS and load data to the cube
Hi ,
I am trying to load data to the billing item cube. The cube contains some fields which are loaded from the billing item ODS, which is loaded directly from DataSource 2LIS_13_VDITM, and some fields which need to be looked up in the service order ODS and the service order operations ODS. I have written a start routine in the cube update rules. Using a SELECT statement on each of the two ODS, I fetch the required fields into internal tables, and in the update rules I write an update routine for the fields using a READ statement.
I am getting an error when it is reading the second select statement(from second ODS).
The error message is
You wanted to add an entry to table
"\PROGRAM=GPAZ1GI2DIUZLBD1DKBSTKG94I3\DATA=V_ZCSOD0100[]", which you declared
with a UNIQUE KEY. However, there was already an entry with the
same key.
The error message says that the internal table filled by the second SELECT statement was declared with a UNIQUE KEY, and an entry with the same key already exists.
Can anyone please help me with a solution for this requirement? I would appreciate it if anyone could send me the code they have written.
Thanks in Advance.
Bobby
Hi,
Can you post the SELECT statements you have written in the start routine?
regards,
raju -
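For background on dumps like the one above: they typically occur when SELECT ... INTO TABLE targets an internal table declared WITH UNIQUE KEY while the selection returns duplicate key values. Below is a hedged sketch of one possible fix; the ODS active table /BIC/AZCSOD0100 and the field names are assumptions inferred from the error text:

```abap
* Sketch: buffer the lookup data in a table with a NON-UNIQUE key so
* that duplicate document numbers from the service order ODS do not
* trigger the unique-key runtime error. Table and field names
* (/BIC/AZCSOD0100, DOC_NUMBER, SRV_ORDER) are assumptions.
TYPES: BEGIN OF ty_lookup,
         doc_number TYPE char10,
         srv_order  TYPE char12,
       END OF ty_lookup.

DATA: it_zcsod0100 TYPE SORTED TABLE OF ty_lookup
                   WITH NON-UNIQUE KEY doc_number.

SELECT doc_number srv_order
  FROM /bic/azcsod0100            " active table of the second ODS
  INTO TABLE it_zcsod0100.
```

Alternatively, keep a standard table and run SORT followed by DELETE ADJACENT DUPLICATES COMPARING doc_number before the READ statements in the update routine.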
Hi All,
Please let me know the difference between an ODS and a cube, practically, with a simple example. I have read and know the differences, but I didn't get the concept.
1. What is detail-level reporting? How does an ODS support this and not a cube?
2. When do we go for an ODS (daily/latest info) and when for a cube (monthly/aggregated info)?
3. Where can we see the difference? I know overwrite and addition.
Please explain with an example. Surely I will assign points.
Thanks in advance.
Cheers,
Sri.
Hi All,
Thanks for your immediate replies.
I did a file upload with a full load to the ODS:
1st load
======
CID PRICE QTY
C01 10 1000
C02 20 2000
C01 20 2000
In Active table & change logs:
======================
CID PRICE QTY
C01 20 2000
C02 20 2000
2nd load
======
CID PRICE QTY
C01 35 1500
C02 40 2200
C03 5 100
2nd load Active table
=================
CID PRICE QTY
C01 35 1500
C02 40 2200
C03 5 100
In the change log
======================
CID PRICE QTY RECORDMODE
C01 20 2000 N
C02 20 2000 N
C01 35 1500
C01 20- 2000- X
C02 40 2200
C02 20- 2000- X
C03 5 100 N
(N = new record, X = before image, blank = after image)
I generated a report on the ODS, but I can't see any details. It shows only the latest info, i.e. what we have in the active table.
We can get the same aggregated view from the InfoCube as well, if we check DB aggregation.
I have a doubt:
customer orders in a day :
CID PRICE QTY DAY
C01 5 15 2007.05.03
C01 5 25 2007.05.03
C01 5 35 2007.05.03
C01 5 45 2007.05.03
we load this 4 times; the ODS overwrites every time, so we see only the latest record.
In the ODS we can see only the latest:
C01 5 45 2007.05.03
In Cube: we can see
CID PRICE QTY DAY
C01 5 15 2007.05.03
C01 5 25 2007.05.03
C01 5 35 2007.05.03
C01 5 45 2007.05.03
and <b>if we check DB aggregation</b> we can see
C01 5 120 2007.05.03 (quantities summed over all 4 records)
With my knowledge, we can see a detailed BW report on the cube, all 4 records, but in the ODS we can see only this record:
C01 5 45 2007.05.03
Please explain how the ODS supports detail records and the cube supports summarized data.
Thanks in advance.
Cheers,
Sri. -
Line items and generic extraction
What exactly do line items mean in DataSources?
And what are the delta-type extractions in generic extraction:
timestamp
calendar day
numeric pointer
Could anyone please explain the difference and in what scenarios we use each?
Looking forward to your reply.
Hi Guru,
Check below doc & thread for Line Item Dimension:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/a7f2f294-0501-0010-11bb-80e0d67c3e4a
Line Item Dimension
If a field (date, progressive document number, timestamp) exists in the extract structure of a DataSource that contains values which increase monotonously over time, you can define delta capability for this DataSource. If such a delta-relevant field exists in the extract structure, such as a timestamp, the system determines the data volume transferred in the delta method by comparing the maximum value transferred with the last load with the amount of data that has since entered the system. Only the data that has newly arrived is transferred.
To get the delta, generic delta management translates the update mode into a selection criterion. The selection of the request is enhanced with an interval for the delta-relevant field. The lower limit of the interval is known from the previous extraction. The upper limit is taken from the current value, such as the timestamp or the time of extraction. You can use security intervals to ensure that all data is taken into consideration in the extractions (The purpose of a security interval is to make the system take into consideration records that appear during the extraction process but which remain unextracted -since they have yet to be saved- during the next extraction; you have the option of adding a security interval to the upper limit/lower limit of the interval).
After the data request has been transferred to the extractor and the data has been extracted, the extractor informs generic delta management that the pointer can be set to the upper limit of the previously returned interval.
To have a clear idea:
1. If the delta field is a date (record create date or change date), use an upper limit of 1 day.
This will load deltas into BW as of yesterday. Leave the lower limit blank.
2. If the delta field is a timestamp, use an upper limit of 1800 seconds (30 minutes).
This will load deltas into BW that are at least 30 minutes old. Leave the lower limit blank.
3. If the delta field is a numeric pointer, i.e. a generated record number as in the GLPCA table, use the
lower limit with a count of 10-100 and leave the upper limit blank. If the value 10 is used, the last 10
records will be loaded again. If a record is created while a load is running, those records
may get lost. To prevent this situation, the lower limit can be used to back up the starting
sequence number. This may result in some records being processed more than once.
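The interval logic described above can be sketched roughly as follows. This is illustrative only: the actual selection is generated by generic delta management, and the table name YTABLE and delta field ZTSTAMP are assumptions:

```abap
* Sketch: delta selection on a timestamp field ZTSTAMP with an
* 1800-second upper safety interval. lv_lower stands for the pointer
* saved by generic delta management after the previous extraction.
DATA: lv_lower TYPE timestamp,
      lv_upper TYPE timestamp,
      lt_delta TYPE STANDARD TABLE OF ytable.

GET TIME STAMP FIELD lv_upper.
lv_upper = cl_abap_tstmp=>subtractsecs( tstmp = lv_upper
                                        secs  = 1800 ).

SELECT * FROM ytable
  INTO TABLE lt_delta
  WHERE ztstamp > lv_lower       " lower limit: previous pointer
    AND ztstamp <= lv_upper.     " upper limit minus safety interval

* After a successful extraction the pointer is advanced to lv_upper.
```

The safety interval simply shifts the upper bound back in time, so records saved during the extraction are picked up by the next delta instead of being skipped.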
Refer this link from help.sap.com
http://help.sap.com/saphelp_erp2005/helpdata/en/3f/548c9ec754ee4d90188a4f108e0121/frameset.htm
Difference between timestamp used in copa and generic data extraction?
COPA timestamps vs Generic timestamps
Check this doc for more info:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
Hope this helps.
Regards,
KK. -
Generic Data source Issue with Infoset
Hi Experts,
I have created a generic DataSource with an InfoSet. As per a customer requirement, I have to modify the join condition between the tables in the InfoSet from an equal (inner) join to a left outer join. I have tested the scenario successfully using an InfoSet query. I have done the below steps on the generic DataSource to check for the same result.
1. Modified the join condition in the InfoSet and generated the InfoSet again.
2. Regenerated the generic DataSource in RSO2.
When I check the same scenario in RSA3, I am getting 0 records. Please help me out.
Thanks in Advance.
Regards,
Raju
Thanks Sriram for your reply.
I am using the same user ID for both SQ02 and RSA3.
I have two InfoSets with the same join conditions. I have created an SQVI report with the first InfoSet and a generic DataSource with the second InfoSet.
I have changed the join condition of the second InfoSet, which is used for the generic DataSource. I am still getting the same number of records in both the SQVI report and in RSA3. I have validated by cross-checking a few records.
The change in the join condition doesn't show any impact on the number of records.
I have activated the extraction structure in RSA6 but am still getting the same records.
Thanks,
Raju -
Bi 7.0 for cube and master data steps
hi friends,
can you give me a step-by-step comparison between BI 7.0 and BW 3.0 for cubes, ODS and master data? Please explain the differences in the steps for those in BI 7.0.
And is reporting the same, or is there any difference in BI 7.0?
Other than the help.sap.com material, if you have any other material or any screenshots, please send them to me.
Thanking you
suneel.
What about master data? For master data we were using only transfer rules; here are we also using transformations and DTPs again?
Yes, we would be using transformations and DTPs for master data too, but the system will prompt you to make the InfoObject an InfoProvider in order to use the new functionality. There is another workaround for it, but we can discuss it later; or else search the forum for that option, I have replied to a couple of questions regarding that.
What about reporting: other than calculated key figures, restricted key figures, filters, rows, columns, free characteristics and variables, are there any changes in the screens?
The basic functionality of all the above-mentioned objects hasn't changed, nor has the query designer itself, but there are a lot of changes in the appearance, and it will take a while to get used to it and find your way around all the functionalities.
I have a doubt: BPS means Business Planning and Simulation; what is this? Because my understanding is that we can use a transactional cube and a transactional ODS with write access for planning as well.
BPS is something I have no clue about (you don't see me answering in the Business Planning forum).
0RecordMode implementation in Generic Extractor & ODS
Hi All,
We have a scenario of a generic extractor supplying data to an ODS. Now we want to implement deletion functionality using record mode.
We have added 0RECORDMODE to the ODS and the InfoSource. What additional things do we need to do to achieve deletion of data from the ODS? (We tried supplying record mode D to the ODS, but the system doesn't delete the record.)
One more thing: in the ODS we have 4 key fields, of which we are able to supply only 3 in the InfoSource that delivers the deletion records.
Could any body please explain as how to achieve this ...
Thanks !
Regards
Mr Kapadia
Hi Kapadia,
At the transfer rules level, change the record mode to R rather than D. Let me know if you have any questions.
Assign points if it helps.
Regards
Satish Arra -
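A transfer routine along the lines suggested in this thread might look like the following sketch. It shows only the routine body, not the generated BW skeleton, and the source field ZDELFLAG that flags deletion records is an assumption:

```abap
* Body of a 0RECORDMODE transfer routine (sketch). ZDELFLAG is an
* assumed flag field in the transfer structure marking records that
* should be removed from the ODS.
IF tran_structure-zdelflag = 'X'.
  result = 'R'.      " reverse image: cancels the record in the ODS
ELSE.
  result = ' '.      " after image: normal overwrite
ENDIF.
returncode = 0.      " 0 = process this record
```

Note that cancelling a record this way requires the full ODS key to arrive with the deletion record; if only 3 of the 4 key fields can be supplied, the existing active record cannot be matched and will not be removed.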
Data not matching for a key figure in ODS and In CUBE.
There is one key figure (net units) where the data does not match between the ODS and the cube for one particular fiscal week (200714), but for the rest of the weeks it matches.
The data is first loaded into the ODS and from there into the cube; the source is a flat file.
The cubes are not compressed, so I checked DB aggregation in the field selection for output screen, but it still does not match. (Based on the InfoCube's additive behaviour, if a file is loaded multiple times the values add up in the InfoCube, so when the cubes are not compressed we manually select this option.)
What might be the possible cause for this ???
thanks in advance
jayakrishna
Hi,
If by any chance you have loaded the same delta twice into the cube, there is a chance of corrupt data. So identify the lowest level of document at which you can find the difference, if possible 0CALDAY-wise, document-wise or sales-org-wise, and try to find out where the difference is. Then check in the change log of the ODS whether the request has gone into the cube twice. The last option is to selectively delete the data from the cube for that week and load a full repair from the ODS for that week.
Sriram -
Dear Friends,
Why are we loading 0FI_GL_4 data into an ODS first and then into a cube?
Both are daily delta uploads.
Is there any special reason?
Regards
ramu
From a business perspective, you can change a G/L document, therefore you need the DSO to resolve the duplicates for an after-image extractor.
However, for CCA you cannot change the source documents, so even though CCA is also an after-image extractor, you do not need a DSO to resolve this. -
How to do reconciliation of ODS data and cube data
Hi All,
How can I do reconciliation of ODS data and cube data? I know how to do reconciliation of R/3 data and BW data.
Regards.
hari
Hi,
Create a MultiCube based on your ODS and cube, identify common characteristics and perform the key figure selections; then create a query showing information from both the cube and the ODS, perhaps with some formulas showing the differences between key figures.
hope this helps...
Olivier. -
How to change it if some data is wrong in ODS and infocube
hi all:
could you please tell me how to correct it if some data is wrong in an ODS or InfoCube?
Best regards
You receive information on all requests that have run in the InfoCube, and you can delete requests if required.
http://help.sap.com/saphelp_nw04s/helpdata/en/d4/aa6437469e4f0ae10000009b38f8cf/frameset.htm
You can find the request ID in the administration of the InfoCube. If a cube is compressed, you can't delete by request...
Regards
Andreas -
I added 0AMOUNT to a generic data source and in RSA3 I am seeing the data
I added 0AMOUNT to a generic data source and in RSA3 I am seeing the data, but I am not seeing any data in the target table.
What would be the cause?
Hi,
I guess you mean the target table in BW, correct?
First replicate your DataSource in BW.
Open your transfer rules. In the tab Transfer Structure/DataSource, locate your field in the right pane (it should be greyed out, not blue); move it to the left (into the transfer structure); reactivate and reload.
You should now see the field in your PSA table.
hope this helps...
Olivier. -
Generic Data Source and InfoPackage Data Selection
Hello, I'm trying to extract data from a generic data source (composed of a view joining 4 tables) using the data selection criteria in an InfoPackage. Sometimes the selection criteria are ignored and more data is pulled than expected. The number of selectable items in the generic data source has been reduced (in case this was an issue), and for a short time the InfoPackage pulled the expected number of records. Then another change was made to the generic data source and the problem returned. In any case, the transported generic data source does not work in the target system either. I'm not sure what is causing this to happen. Any ideas? BW 3.5.
Regards,
Sue
Ramesh P
Hi,
Some DataSources will not have default InfoPackages. You can create InfoPackages according to the requirement; there is no big deal in creating them, so you can go ahead and create them.
Regards
Karthik -
Data from ODS and a Inventory cube
Hi Guys,
We have an inventory cube ZIC_C03 and an ODS which has some detailed information, like batch-level information.
There is a MultiProvider on top of these two.
My question is: can we put a cube on top of the ODS and feed that into the MultiProvider?
Will that improve query performance?
What are the drawbacks or repercussions I need to understand first?
Thanks a lot in advance for your help.
Hi Murali,
Yes, if you load into a cube from your ODS you can improve performance. I typically load data that can be summarized into the cube and cut out document numbers, line items, etc. If you need the detail, just jump to a query off the ODS objects' MultiProvider.
The drawback is that you no longer get all of your detailed information at the cube level.
Thanks,
Brian