Regarding transferring data into different cubes
Hi all,
The relation between an InfoSource and a cube is one-to-many, so by applying update rules we can transfer data from one InfoSource to several cubes. Please explain a scenario in which the same data is stored in different cubes.
thanks in advance
regards
karna
Hi
First, you generally won't find a scenario where you have to store the same data in different cubes; storing identical data in multiple cubes just wastes database space. The one-to-many scenario could apply as USERPV said: you have data from a single source system that contains sales information for multiple countries. If you want to split this data by country and store it in the respective cubes, you can write ABAP code in the transfer or update rules to do so. You can then report on the individual cubes, or report on all of them at once using a MultiProvider.
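For example (a sketch only: the characteristic name COUNTRY and the value 'DE' are placeholders, not from the original post), the start routine of the update rules for the cube that should receive only German sales could simply drop all other records from the data package:

```abap
* Update-rule start routine sketch: keep only one country's records.
* COUNTRY and 'DE' are placeholder names for illustration.
DELETE DATA_PACKAGE WHERE country NE 'DE'.
```

Each cube's update rules would get its own variant of this routine with its own country value, so a single InfoSource load feeds several country-specific cubes.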
Hope this will clear your doubt.
regards
vin
Similar Messages
-
Can we send data to different data targets from a single DataSource? How?
Hi,
Can we send data to different data targets from a single DataSource? How?
Hi,
Create a transformation for each target and connect it through a DTP if you are on BI 7.0.
If it is BW 3.5, create transfer rules to load the InfoSource, then different update rules for the different targets, and load using an InfoPackage.
If you are talking about loading data from one R/3 DataSource to multiple data targets in BI, follow the steps below.
1) Create init InfoPackages and run them with different selections to the different DSOs (with the same selection it is not possible).
2) You will then have different delta queues for the same DataSource in RSA7.
3) Your delta loads will then run fine from the same R/3 DataSource to the different data targets.
Hope this helps
Regards,
Venkatesh -
Regarding loading the data into ODS
Hi all,
I have a situation where I had filled an ODS with data. Now a few fields have to be added to the ODS for reporting purposes. I have added the fields, but I am not sure how to fill only those fields in the ODS so that the data can be shown in reports on that ODS. Are there any prerequisites or precautions that have to be taken?
Regards
YJ
Hi,
Just write a small ABAP program and execute it to fill the added field, for example:
* Sketch: zsource and keyfield are placeholder names; /bic/aODS00 is
* the active table of the ODS. int_tab holds the key plus the value
* for the newly added field.
DATA: int_tab TYPE TABLE OF zsource WITH HEADER LINE,
      counter TYPE i.

SELECT * FROM zsource INTO TABLE int_tab.

IF sy-subrc <> 0.
  MESSAGE s000 WITH 'No records selected for the specified criteria.'.
ELSE.
  LOOP AT int_tab.
    UPDATE /bic/aods00
       SET added_field = int_tab-added_field
       WHERE keyfield  = int_tab-keyfield.
    IF sy-subrc = 0.
      counter = counter + 1.
    ENDIF.
  ENDLOOP.
ENDIF. -
We want to set up a delta refresh from R/3 that will pull data into two cubes. Into one cube I want all the records loaded; into the second cube I want to filter the records that are loaded. I can figure out how to load the data into two cubes, but can I place a filter on one of the loads (i.e. only load records where type = 'X')?
Thanks,
Chris
You can do that in the update rules for the second cube, in the start routine:
DELETE DATA_PACKAGE WHERE type NE 'X'.
(For an internal table the ABAP statement is DELETE itab WHERE ..., without FROM; and since the second cube should keep only records with type = 'X', you delete the ones where type is not 'X'.)
Then, with only one InfoPackage, you can load both cubes with different conditions.
Hope it helps.
Regards,
Luis -
Error While loading the data into PSA
Hi Experts,
I have already loaded the data into my cube, but it has no values for some fields. So I modified the data in the flat file again and tried to load it into the PSA. But when starting the InfoPackage, I got an error saying:
"Check Load from InfoSource
Created YOKYY on 20.02.2008 12:44:33
Check Load from InfoSource , Packet IP_DS_C11
Please execute the mail for additional information.
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
Please help me with this.
With Regards,
Yokesh.
Hi,
After editing the file, did you save and close it?
This error can occur if your file was still open at the time of the request.
Also, did you check the file path settings?
If everything is correct, try saving the InfoPackage once and loading again.
Thanks,
JituK -
Regarding Reloading of Data Into The Cube
Hi All,
I have a situation where I have to reload data into an InfoCube. The InfoCube already has a full load and deltas for a period of one month. If I do a full load now, will I get the data as of today, and can I run deltas from tomorrow onwards after doing an init?
Do I have to delete the data from the delta queue in the source system (RSA7)?
Please clarify me.
Thanks in advance.
Regards
Jay
Hi,
If you already have initializations on the cube, you can do a repair full load, which will bring in all the data, and you can continue using the delta functionality from tomorrow.
hope this helps.
Ravi. -
Regarding Loading the data from DSO to cube.
Hello Experts,
I have a DSO that loads data from the PSA using a 7.0 transformation (via DTP), and a cube that loads data from that DSO using 3.x rules. I deleted the init request of an InfoPackage before loading the data into the DSO. But when I load the data from the DSO to the cube (right-click the DSO -> Additional Functions -> Update 3.x data to targets), I get an error like 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
Please help me with this.
I want to load the data in the init request to the cube.
Thanks
Hi Shanthi,
Thanks for the reply. I have already deleted the init request from the source system to the DSO and tried again, but I still get the error.
Thanks -
Hi ,
We have a catalog that defines two types of products (they have too many different properties), so we wanted to keep them on two different MDEX engines and serve the application requests from both. The DB catalog and the front-end ATG application are the same for both MDEX instances.
Is it possible to have two different output config XML files and index the data into two Endeca apps using the same indexing component, ProductCatalogSimpleIndexingAdmin?
Thanks
Dev
Hi, I had a similar problem some months ago. I created a separate component, ProductCatalogSimpleIndexingAdminSecond. Afterwards a colleague gave me some advice:
Creating a separate component like ProductCatalogSimpleIndexingAdmin for the second IOC is a possible way to resolve your situation, but I am afraid it will require creating many duplicates of already existing components.
In my opinion the better way is the following:
Start from the AssemblerApplicationConfiguration and ApplicationConfiguration components. They contain the details of the connection between ATG and Endeca. Of course, you should configure different components for the different Endeca apps.
After that:
Find all components that use AssemblerApplicationConfiguration and ApplicationConfiguration, and customize them to use one or the other *Configuration component depending on which index is running (there are many ways to do this; the simplest is a global custom component holding a flag).
Then customize the existing ProductCatalogSimpleIndexingAdmin to use one or the other IOC, setting the flag in the global custom component when indexing starts. You can add methods to your custom ProductCatalogSimpleIndexingAdmin such as:
Execute a baseline index for both IOCs (one by one)
Execute a baseline for IOC 1
Execute a baseline for IOC 2
Note: be careful with incremental (partial) indexing in this configuration; resolving conflicts in the incremental index should be tackled only after these changes are fully implemented.
Regards -
Unable to load the data into the PSA.
Hi Xpert,
I am unable to load the data into the PSA.
My source system is not R/3; it is BI.
(Actually we extract the data from a cube into a table with the help of a program, then I create a generic DataSource on that table and load the data into my cube.)
I am getting this error message:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the
Then I replicated the DataSource and activated it, but I still get this error message.
When I check my DataSource in RSA3, I get this error:
Two internal tables are neither compatible nor convertible.
What happened?
Error in the ABAP Application Program
The current ABAP program "SAPLAQBWEXR" had to be terminated because it has
come across a statement that unfortunately cannot be executed.
Error analysis
You attempted to move one data object to another.
This is not possible here because the internal tables concerned
are neither compatible nor convertible.
Trigger Location of Runtime Error
Program SAPLAQBWEXR
Include LAQBWEXRU01
Row 419
Module type (FUNCTION)
Module Name AQBW_CALL_EXTRACTOR_QUERY
Regards,
sat534
Hi
The problem looks to be with the generic DataSource.
Please share details of the DataSource and how you created it.
Regards
Sudeep -
How to create a procedure in Oracle to write data into a file
Hi All,
I am just wondering how to create a procedure which will do the following tasks:
1. Concat the field names
2. Union all the particular fields
3. Convert the date field into IST
4. Prepare the statement
5. write the data into a file
Basically what I am trying to achieve is to convert a MySQL procedure to Oracle. The MySQL procedure is as follows:
DELIMITER $$
USE `jioworld`$$
DROP PROCEDURE IF EXISTS `usersReport`$$
CREATE DEFINER=`root`@`%` PROCEDURE `usersReport`(IN pathFile VARCHAR(255),IN startDate TIMESTAMP,IN endDate TIMESTAMP )
BEGIN
SET @a= CONCAT("(SELECT 'User ID','Account ID','Gender','Birthdate','Account Registered On') UNION ALL (SELECT IFNULL(a.riluid,''),IFNULL(a.rilaccountid,''),IFNULL(a.gender,''),IFNULL(a.birthdate,''),IFNULL(CONVERT_TZ(a.creationDate,'+0:00','+5:30'),'') INTO OUTFILE '",pathFile,"' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '' LINES TERMINATED BY '\n' FROM account_ a where a.creationDate>='",startDate,"' and a.creationdate <='",endDate,"')");
PREPARE stmt FROM @a;
EXECUTE stmt;
DEALLOCATE PREPARE stmt ;
END$$
DELIMITER ;
Regards,
Vishal G
1. Concat the field names
The double pipe (||) is the concatenation operator in Oracle. There is also a CONCAT function for this purpose.
2. Union all the particular fields
Not sure what you mean by UNION ALL on particular fields. UNION ALL is a set operation applied to two result sets that have the same projection.
3. Convert the date field into IST
SQL> select systimestamp "Default Time"
2 , systimestamp at time zone 'Asia/Calcutta' "IST Time"
3 from dual;
Default Time IST Time
05-05-15 03:14:52.346099 AM -04:00 05-05-15 12:44:52.346099 PM ASIA/CALCUTTA
4. Prepare the statement
What do you mean by prepare the statement?
5. write the data into a file
You can use the API UTL_FILE to write to a file. -
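Putting the pieces together, a minimal sketch of the Oracle equivalent using UTL_FILE could look like the following. Note the assumptions: EXPORT_DIR must be an Oracle DIRECTORY object you have write access to, and the table/column names are taken from the MySQL procedure above; the time-zone shift mirrors CONVERT_TZ(..., '+0:00', '+5:30').

```sql
CREATE OR REPLACE PROCEDURE users_report (
    p_filename  IN VARCHAR2,
    p_startdate IN TIMESTAMP,
    p_enddate   IN TIMESTAMP
) AS
    v_file UTL_FILE.FILE_TYPE;
BEGIN
    -- EXPORT_DIR is an assumed DIRECTORY object (CREATE DIRECTORY ... + grants)
    v_file := UTL_FILE.FOPEN('EXPORT_DIR', p_filename, 'w');
    UTL_FILE.PUT_LINE(v_file,
        'User ID,Account ID,Gender,Birthdate,Account Registered On');
    FOR r IN (SELECT NVL(a.riluid, ' ')       AS riluid,
                     NVL(a.rilaccountid, ' ') AS rilaccountid,
                     NVL(a.gender, ' ')       AS gender,
                     a.birthdate,
                     -- shift the UTC creation time to IST
                     FROM_TZ(CAST(a.creationdate AS TIMESTAMP), '+00:00')
                         AT TIME ZONE 'Asia/Calcutta' AS creation_ist
                FROM account_ a
               WHERE a.creationdate BETWEEN p_startdate AND p_enddate)
    LOOP
        UTL_FILE.PUT_LINE(v_file,
            r.riluid || ',' || r.rilaccountid || ',' || r.gender || ',' ||
            TO_CHAR(r.birthdate) || ',' || TO_CHAR(r.creation_ist));
    END LOOP;
    UTL_FILE.FCLOSE(v_file);
EXCEPTION
    WHEN OTHERS THEN
        IF UTL_FILE.IS_OPEN(v_file) THEN
            UTL_FILE.FCLOSE(v_file);
        END IF;
        RAISE;
END users_report;
/
```

Unlike SELECT ... INTO OUTFILE in MySQL, UTL_FILE writes on the database server side, so the directory must exist there, not on the client.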
Error while activating the data into DSO
Hi
My base DSO is used to load 4 other data targets.
In the process chain, after the base DSO is activated, four DTPs run to load the data from the base DSO to another DSO and three cubes.
When loading to the other DSO, we encountered an error:
Object is currently locked by BI Remote
Lock not set for : Activating data in DSO
Activation of M records terminated.
1. My question is: when loading data from the base DSO to other objects, how does the lock mechanism work?
I know that we cannot load data into the base DSO while it is sending data to a target.
2. What difference does it make when loading DSO to DSO versus DSO to cube in parallel?
Thanks
Annie
Hi Annie,
1. My question is: when loading data from the base DSO to other objects, how does the lock mechanism work?
I know that we cannot load data into the base DSO while it is sending data to a target.
Do you mean that the load into the 2nd-level DSO was successful, but the activation failed?
Have you checked in SM12 whether that 2nd-level DSO is locked?
Are any further targets being loaded from this 2nd-level DSO?
Suppose you are loading DSO A, and in the meantime a load starts from DSO A to some other target (a DSO or a cube). Then the activation in DSO A will fail: since the last request in DSO A is not yet activated, that request is not considered in the subsequent load, and since that load is already in progress, the system will not allow any new request to be activated.
Another possibility is that DSO A is also being loaded from other sources; while such a load is in progress, the system will not allow the activation.
So check this and start the activation again.
2. What difference does it make when loading DSO to DSO versus DSO to cube in parallel?
The main difference is that there is no activation concept in a cube, so a cube can be loaded from several sources in parallel.
A DSO can also be loaded in parallel, but activation should start only once all the loads have completed successfully.
Regards,
Debjani.... -
How to write plan data into a transactional cube from Visual Composer?
Dear experts,
Visual Composer is very powerful for building a nice-looking web report (I think it is much better than Web Application Designer in terms of freely positioning the chart/graph/table/selection parameters anywhere we like; we can't get this flexibility in WAD). It also integrates strongly with R/3 for posting transactions via RFC-enabled function modules (such as changing a customer credit limit, approving a PO, approving a sales order, etc.), so that the R/3 user doesn't even have to know the transaction code.
In addition, I was wondering whether VC is also capable of writing plan data into a transactional cube via a write-enabled query. We could simply insert a BW query into VC as a planning layout (with edit mode set to "editable") plus some wizards to make it more user-friendly and self-explanatory for the users.
But I am not sure how to save the changes back to transactional cube from VC.
Could anyone please kindly advise how to achieve this.
"Seeing is believing" is always my learning method, so I would appreciate some "actioanable" explanation or hints.
Many thanks,
Sen
Hi Sen,
As far as I know, it is not possible to write data into transactional cubes directly from a table in Visual Composer by standard means, but it is possible to integrate a WAD template containing your input-ready query into a URL element in VC (you can fill entry variables in VC and call the WAD dynamically). So it is possible to create a planning application in VC that calls WAD templates dynamically.
Another way to interact through VC is to create button objects that call planning sequences (via a BAPI calling RSPLS_PLSEQ_EXECUTE). You can also fill variables in VC and pass them to the sequence.
Regards,
Enrique -
Using BAPI, how to upload data into the SAP database?
hi dear all,
I am facing a problem with BAPI. Please educate me on BAPI.
I will be waiting for a reply.
my e-id :[email protected]
thanks &amp; regards
shiva.
Hi
A BAPI is a method of an SAP Business Object. BAPIs enable SAP and third-party applications to interact and integrate with each other at the business object / process level.
Check this link to know more about BAPI.
http://www.sapgenie.com/abap/bapi/example.htm
http://sappoint.com/abap/
Batch Data Communication (BDC) is the oldest batch interfacing technique that SAP has provided since the early versions of R/3. BDC is not a typical integration tool in the sense that it can only be used for uploading data into R/3, so it is not bi-directional.
BDC works on the principle of simulating user input on transaction screens via an ABAP program. Typically the input comes in the form of a flat file. The ABAP program reads this file and formats the input data screen by screen into an internal table (BDCDATA). The transaction is then started using this internal table as input and executed in the background.
With Call Transaction, the transactions are triggered at processing time itself, so the ABAP program must do the error handling. It can also be used for real-time interfaces and custom error handling and logging.
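The BDCDATA / CALL TRANSACTION pattern described above can be sketched as follows (the transaction code, module pool, screen number, and field name are placeholders for illustration):

```abap
DATA: bdcdata TYPE TABLE OF bdcdata    WITH HEADER LINE,
      messtab TYPE TABLE OF bdcmsgcoll WITH HEADER LINE.

* open the first screen of the (placeholder) transaction
CLEAR bdcdata.
bdcdata-program  = 'SAPMZXXX'.   "module pool - placeholder
bdcdata-dynpro   = '0100'.       "screen number - placeholder
bdcdata-dynbegin = 'X'.
APPEND bdcdata.

* fill one input field on that screen
CLEAR bdcdata.
bdcdata-fnam = 'ZFIELD-NAME'.    "screen field name - placeholder
bdcdata-fval = 'VALUE'.
APPEND bdcdata.

* run in background mode, synchronous update, collect messages
CALL TRANSACTION 'ZXXX' USING bdcdata
     MODE 'N' UPDATE 'S'
     MESSAGES INTO messtab.
```

After the call, messtab holds the messages raised by the transaction, which is where the custom error handling mentioned above hooks in.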
To know more about BDC,
check the link.
http://sappoint.com/abap/
The main differences are:
In the case of BDC, data transfer takes place from a flat file residing in the SAP system into the SAP system itself, whereas BAPIs are remote-enabled function modules, assigned to business objects, that can transfer data between business partners running different systems, including non-SAP ones.
Also, when you upgrade your system, BDC recordings may stop working (screen flows can change), whereas BAPIs remain supported across upgrades.
http://www.sap-img.com/abap/ale-bapi.htm
SAP BAPI
BAPI STEPS
Hope this helps.
ashish -
Loading data into Fact/Cube with surrogate keys from SCD2
We have created two dimensions, CUSTOMER and PRODUCT, with surrogate keys to reflect SCD Type 2.
We now have the transactional data that we need to load.
The data has a customer id that relates to the natural key of the customer dimension and a product id that relates to the natural key of the product dimension.
Can anyone point us to documentation that explains the steps necessary to populate our fact table with the appropriate surrogate key?
We assume that we need a lookup between the current version of the customer and the incoming transaction data, but we are not sure how to go about this.
Thanks in advance for your help.
Laura
Hi Laura
There is another way to handling SCD and changing Facts. This is to use a different table for the history. Let me explain.
The standard approach has these three steps:
1. Determine if a change has occurred
2. End Date the existing record
3. Insert a new record into the same table with a new Start Date and dummy End Date, using a new surrogate key
The modified approach also has three steps:
1. Determine if a change has occurred
2. Copy the existing record to a history table, setting the appropriate End Date en route
3. Update the existing record with the changed information giving the record a new Start Date, but retaining the original surrogate key
What else do you need to do?
The current table can use the surrogate key as the primary key, with the natural key as a unique key. The history table has the surrogate key and the end date in the primary key, with a unique key on the natural key and the end date.
For end-user queries, which more than 90% of the time go against current data, this method is much faster because only current records are in the main table and no filters on dates are needed. If a user wants to query history and current data combined, a view that unions the main and historical tables can be used.
One more thing to note: if you adopt this approach for your dimension tables, they always keep the same surrogate key. This means that if you follow a strict Kimball approach and make the primary key of the fact table a composite of the foreign keys from each dimension, you NEVER have to rework this primary key. It always points to the correct dimension, thereby eliminating the need for a surrogate key on the fact table!
I am using this technique to great effect in my current contract and performance is excellent.
The splitter at the end of the map splits the data into three sets. Set one is an insert into the main table when there is no match on the natural key. Set two applies when there is a match on the natural key and the delta comparison has determined that a change occurred; in this case the current row needs to be copied into history, setting the End Date to the system date en route. Set three also applies when there is a match on the natural key and a change occurred; in this case the main record is simply updated, with the Start Date reset to the system date.
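Sketched in plain SQL (the table and column names are illustrative, not from the original post), steps 2 and 3 of the modified approach for one incoming natural key look like:

```sql
-- Step 2: copy the current row to the history table, end-dating it en route
INSERT INTO customer_dim_hist
       (customer_sk, customer_nk, cust_attrs, start_date, end_date)
SELECT customer_sk, customer_nk, cust_attrs, start_date, SYSDATE
  FROM customer_dim
 WHERE customer_nk = :incoming_nk;

-- Step 3: overwrite the current row in place, keeping the surrogate key
UPDATE customer_dim
   SET cust_attrs = :incoming_attrs,
       start_date = SYSDATE
 WHERE customer_nk = :incoming_nk;
```

Because the UPDATE never touches customer_sk, any fact rows already pointing at that surrogate key remain valid, which is the point of the technique.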
By the way, I intend to put a white paper together on this approach if anyone is interested.
Hope this helps
Regards
Michael -
Export transactional data into a different client
Dear All Experts,
For management audit purposes we urgently need to copy all FI transactional data for a particular year into a different client.
As an example:
Source Client: 888
Target client: 700
Company: 1000
Year: 2013
FI document types: All
Please help me find a solution; I would appreciate your advice.
SCC8/SCC7 can copy all records from client to client, but we need only a specific company/period.
Thanks and Best regards
Bishnu