Data load by Request ID!
Hi Friends,
I have a scenario: I want to load data from an InfoCube to a DSO. I generated an export DataSource for the InfoCube and created update rules between the InfoCube and the DSO. The InfoCube contains 10 requests that were loaded. Now, I do NOT want to load the data belonging to all 10 requests; instead, I want to load the data for one particular request ID. How can I do this?
<b>your help will be appreciated in terms of points.</b>
Thanks,
Manmit
Actually, thinking about it, there is an easier way.
1) Create a query on the InfoCube, with all the required fields (the ones you need loaded into the ODS) in rows and columns.
2) Put a variable on 0REQUID in the global filter section of the query.
3) Create a transactional ODS that has all the fields present in the query.
4) Create an APD that saves the output of the above query into the transactional ODS.
5) Create update rules between the transactional ODS and the ODS that needs to be loaded with data.
6) Execute the APD, passing the request ID of interest to the query. The output is saved into the transactional ODS, and this output is exactly the cube data for the request ID selected during query execution.
7) Load from the transactional ODS to your original ODS. You can safely do a full load, because the transactional ODS only holds the data of interest to you.
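The core idea of the steps above — carry only one request's rows forward and then do a plain full load — can be sketched outside of BW. A minimal illustration in Python (the record layout and request IDs are invented for the example; this is not a BW API):

```python
# Keep only the rows belonging to one request ID before passing
# them on. Record layout and request IDs are illustrative only.
records = [
    {"REQUEST_ID": "REQU_001", "MATERIAL": "M1", "QTY": 10},
    {"REQUEST_ID": "REQU_002", "MATERIAL": "M2", "QTY": 20},
    {"REQUEST_ID": "REQU_002", "MATERIAL": "M3", "QTY": 30},
]

def rows_for_request(rows, request_id):
    """Return only the rows that belong to the given request ID."""
    return [r for r in rows if r["REQUEST_ID"] == request_id]

subset = rows_for_request(records, "REQU_002")
print(len(subset))  # 2
```

This mirrors what the 0REQUID variable does in the query: the downstream target only ever sees the filtered subset, so a full load of it is safe.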
Good luck!
Similar Messages
-
Data Load Issue "Request is in obsolete version of DataSource"
Hello,
I am getting a very strange data load issue in production. I am able to load the data up to the PSA, but when I run the DTP to load the data into 0EMPLOYEE (a master data object), I get the message below:
Request REQU_1IGEUD6M8EZH8V65JTENZGQHD not extracted; request is in obsolete version of DataSource
The request REQU_1IGEUD6M8EZH8V65JTENZGQHD was loaded into the PSA table when the DataSource had a different structure to the current one. Incompatible changes have been made to the DataSource since then and the request cannot be extracted with the DTP anymore.
I have taken the following actions:
1. Replicated the DataSource
2. Deleted all requests from the PSA
3. Activated the DataSource using report RSDS_DATASOURCE_ACTIVATE_ALL
4. Re-transported the DataSource, transformation, and DTP
Still getting the same issue
If you have any idea please reply asap.
Samit
Hi,
Generate your DataSource in R/3, then replicate it and activate the transfer rules.
Regards,
Chandu. -
Data loaded but request is in red
Hi Friends,
I have an ODS and a cube; the data is loaded into the ODS first and then into the cube from that ODS, all through a process chain. Yesterday the ODS data loaded and became available for reporting, but the request is in red, and the further load into the cube failed. My doubt: if the data is loaded and available for reporting but the request is red, will that cause any problem? I tried to change the QM status, but it did not allow the change. Please guide me on this. Max points will be awarded.
Thanks,
Prasad
Hi...
I think the request in your ODS is partially activated, because a red QM status together with "available for reporting" only happens when a request is partially activated; you cannot change the status manually in that state.
If the load is a full upload, delete the request from the target and reload it.
But if it is a delta load, change the QM status of the InfoPackage to red in RSMO; at the target level, I mean in the Manage screen, it will not allow you to change it. For a full upload it is also better to change the QM status to red first, then delete the request and reload it.
Is there an ODS activation step in your process chain? If not, check the settings of the ODS in RSA1: in the settings tab there is a checkbox "Activate Data Automatically". Check whether it is ticked; if it is, the ODS data is activated automatically after loading. But when you are using a process chain this is not good practice; it is better to keep a separate ODS activation step in the chain.
Hope this helps.
Regards,
Debjani -
Is it possible to load a request already updated in the data target ?
Hi Experts,
Question: can we load a request which is already updated in the data targets and whose data mart status is set?
Data flow:
source system -> ODS1 -> InfoCube1
Now, requests are first loaded into ODS1. Then these request are loaded into Infocube1.
Now, I deleted the request in InfoCube1.
Is it possible to reload the request from ODS1 to Infocube1 ?
Please suggest how.
Regards,
Suraj S Nair
Hi Masthan Kaniganti,
It is possible to load the data, but not as the same request.
The reason we cannot reload the same request is that at the ODS level the delta records have already been updated; hence, with a selection condition, we have to load the current and previous data.
And one more thing: if you deleted the request at the cube level, you can reconstruct the same request via the Reconstruction tab.
Hope this helps; let me know if you need any further info on this.
Best Regards,
maruthi -
Need help in deleting the data load request
Hi All,
In the system, one erroneous data load request has been generated (its status is "monitor without request"), and because of it I am unable to activate other requests.
I want to delete this request; I try to delete it, but it is not deleted and still remains in the system, and I am unable to activate it as it is erroneous. I have also deleted the request from the backend table RSMONICDP, but still without success.
Please give ur suggestions on this.
Thanks,
Vinay
Hi,
There is a possible solution for these kind of issues .
1. Note the bad request number.
2. Go to SE16.
3. Open table RSODSACTREQ. Filter the contents by that request number; there will be one entry for that request. Delete that entry.
4. Open table RSICCONT; there will be one entry for that request. Delete that entry.
5. Open table RSMONICDP. Filter the contents by that request number; there will be 3-4 entries for that request, depending on the number of data packages in the request. Delete all those entries. -
Sample SOAP request for Data Loader API
Hi
Can anyone please help me out by giving a sample SOAP request for the Data Loader API? This is to import 1K records from my system into the CRM instance I have.
Log into the application and then click on Training and Support; there is a WS Library of Information within the application.
-
Can't Delete Data Load Request
Hi All,
During the initial data load there was an error, and the data load in Manage ODS is shown as RED; also, 12 records are shown as uploaded, which can be seen in the New Data tab. The request is also shown as available for reporting, but with a red light. Now we want to delete this request but we can't, and the reason is beyond our knowledge. Can someone help with how to delete this request, so that we can load the data again? Below is what we could find:
Message type: E (Error)
Message class: RSMPC (Messages for Process Chains of Generic Processes)
We didn't use Process Chain for uploading the initial data
Thanks & Regards
Vikas
Hi,
Go to the InfoPackage, select the "Scheduler" menu, delete the initial upload data, and try the initial load again.
Regards,
Happy Tony -
Underlying RS Table :: Date & Time for Data Load Requests
Dear SAP BW Community,
In BW 3.5, does anybody know the underlying "RS" table where I can check the earliest date and time at which a data target was loaded, by providing the data target's technical name in SE16?
Thanks!
OK, I've found the timestamps of the data load requests in the table RSMONICDP.
To get the earliest data load for InfoCube FRED, I go to Oracle via SQL*Plus, as follows:
select to_char(min(TIMESTAMP)) from sapr3.RSMONICDP where ICUBE = 'FRED'; -
Data load problem - BW and Source System on the same AS
Hi experts,
I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
BW works on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the DataSources from this source system, and everything worked fine.
Now I want to load data from SRM (client 100) into BW (client 001) using standard DataSources and extractors. To do this, I've created an InfoPackage on one standard DataSource (which has data, checked through RSA3 on the client 100 source system). I've started the data load process, but the monitor says that no IDocs arrived from the source system and keeps the status yellow forever.
Additional information:
<u><b>BW Monitor Status:</b></u>
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System Response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
<b><u>BW Monitor Details:</u></b>
0 from 0 records
but there are 2 records on RSA3 for this data source
Overall status: Missing messages or warnings
- Requests (messages): Everything OK
o Data request arranged
o Confirmed with: OK
- Extraction (messages): Missing messages
o Missing message: Request received
o Missing message: Number of sent records
o Missing message: Selection completed
- Transfer (IDocs and TRFC): Missing messages or warnings
o Request IDoc: sent, not arrived ; Data passed to port OK
- Processing (data packet): No data
<b><u>Transactional RFC (sm58):</u></b>
Function Module: IDOC_INBOUND_ASYNCHRONOUS
Target System: SRMCLNT100
Date Time: 08.03.2006 14:55:56
Status text: No service for system SAPSRM, client 001 in Integration Directory
Transaction ID: C8C415C718DC440F1AAC064E
Host: srm
Program: SAPMSSY1
Client: 001
Rpts: 0000
<b><u>System Log (sm21):</u></b>
14:55:56 DIA 000 100 BWREMOTE D0 1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
Documentation for system log message D0 1 :
The transaction has been terminated. This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction. The actual reason for the termination is indicated by the T100 message and the parameters.
Additional documentation for message IDOC_ADAPTER 601 ("No service for system &1, client &2 in Integration Directory"): no further documentation exists for message 601.
<b><u>RFC Destinations (sm59):</u></b>
Both RFC destinations look fine, with connection and authorization tests successful.
<b><u>RFC Users (su01):</u></b>
BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
Could someone help?
Thanks,
Guilherme
Guilherme,
I don't see any reason why it's not bringing the data. Are you doing a full extraction or a delta? If delta, please check whether the extractor is delta-enabled; sometimes this causes problems.
Also check this weblog on basic checks for data load errors; it may help:
/people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
Thanks
Sat -
How can we view the data asociated with Request ID
Hi All,
I have data loaded into an ODS; multiple files are loaded into this ODS. Now I want to view the data for a particular request, or rather, I want to know how we can see the data associated with a particular request.
Please help me out.
Thanks & Regards
Amit Kumar
Copy the request number (not the request ID) from the ODS Manage screen, go to RSRQ, enter the request number, and execute. It shows the monitor screen; now go to the PSA from the top and check the data. But this is possible only if you are loading data to the PSA along with the ODS.
Otherwise, go to Display Data of the ODS from its right-click menu; in the selection screen, enter the request ID whose data you want to check. It then displays only that request's data.
Hope this helps.
Veerendra.
Edited by: denduluri veerendra kumar on Dec 1, 2009 12:40 PM -
Data load stuck from DSO to Master data Infoobject
Hello Experts,
We have an issue where a data load is stuck between a DSO and a master data InfoObject.
Data is uploaded from a (standard) DSO to the master data InfoObject.
This InfoObject has display and navigational attributes, which are mapped from the DSO to the InfoObject.
Now we have added a new InfoObject as an attribute to the master data InfoObject and made it a navigational attribute.
Now, when we do a full load via DTP, the load is stuck and not processing.
Earlier it took only 5 minutes to complete the full load.
Please advise what could be the reason and cause behind this.
Regards,
Santhosh
Hello guys,
Thanks for the quick response.
But it is not proceeding any further.
The request is still running.
Earlier, this same data loaded in 5 minutes.
Please find the screenshot.
Master data for the InfoObjects is loaded as well.
In SM50 I can see that the process is at the P table of the InfoObject.
Please advise.
Please find the details:
Updating attributes for InfoObject YCVGUID
Start of Master Data Update
Check Duplicate Key Values
Check Data Values
Process Time-Dependent Attributes - green
No Message: Process Time-Dependent Attributes - yellow
No Message: Generates Navigation Data - yellow
No Message: Update Master Data Attributes - yellow
No Message: End of Master Data Update - yellow
and nothing is going further in SM37.
Thanks,
Santhosh. -
Data load from DSO to cube fails
Hi Gurus,
The data loads failed last night, and when I dig inside the process chains I find that the cube, which should be loaded from the DSO, is not getting loaded.
The DSO has been loaded without errors from ECC.
The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
I looked in the PSA; it has about 50,000 records,
and all data packages have a green light and all amounts have 0CURRENCY assigned.
I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from "no update, no reporting" to "valid records update, no reporting (request red)" and executed, and the error stack showed 101 records.
The ZARAMT field has 0CURRENCY blank for all these records.
I tried to assign USD to them; the changes were saved, and I tried to execute again, but then the message says the request ID should be repaired or deleted before execution. I tried to repair it, but it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records; when I look at the records, the changes I made no longer exist.
If I delete the request ID before making the changes and then try to save the changes, they don't get saved.
What should I do to resolve the issue?
thanks
Prasad
Hi Prasad,
The error stack is request-specific. Once you delete the request from the target, the data in the error stack is also deleted.
Actually, in this case what you are supposed to do is:
1) Change the error handling option to "valid records update, no reporting (request red)" (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
2) Correct the erroneous records in the error stack.
3) In the DTP, in the Update tab, you will find the option "Error DTP". If it is not already created, you will see the option "Create Error DTP"; click there and execute the error DTP. The error DTP fetches the records from the error stack and creates a new request in the target.
4) Then manually change the status of the original request to green.
But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again; check the source system and fix it there for a permanent solution.
Regards,
Debjani -
Hello friends
I am facing a problem with a data load. I modified a cube by adding a few characteristics. The characteristics were first added in the communication structure and in the transfer rules. Then I reactivated the update routine. Finally, I deleted all previous data load requests for the cube and did a full load. However, I wasn't able to find any data for the newly added fields in the cube.
Did I miss something? Any help will be appreciated in this regard.
Thanks
Rishi
How come an ODS came into the picture? This was not mentioned in your previous post. Are you loading from ODS to cube and having problems?
It looks like you are not using a DTP. In that case, check the change log for the newly added fields, then set the data flow as ODS > PSA > Cube and check in the PSA whether those fields are present. If yes, check the update rules in debugging mode to see whether those fields get deleted.
Hope it Helps
Chetan
@CP.. -
Hi guys...
Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume I have to use two InfoPackages, and I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
1. When I tried to create a query directly on the DSO in Query Designer, I could not find the InfoObject 0REQUESTID. How can I see the data request by request rather than all together?
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 [X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures]
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load data, I load it in two requests, and these are the only two requests in my DSO. How will the data look in the DSO: does it get stored in two separate rows or a single row? How is it shown in a query result?
If the keys do not match, how will the data be shown for key figures that were not loaded by that request?
3. I know that in a DSO we have two options, Overwrite/Addition. How will the data load behave in the following situation?
DataSource 1 feeds this in request 1:
X1 X2 Y1
a b 10
DataSource 2 feeds this in request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result be shown for our two options, Addition and Overwrite? Will request 2 overwrite or add up the data in Y1?
Thanks.
Hi guys...
Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume I have to use two InfoPackages, and I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
1. When I tried to create a query directly on the DSO in Query Designer, I could not find the InfoObject 0REQUESTID. How can I see the data request by request rather than all together?
The request ID is only part of the new-data table; after activation, your request information is lost. If you want to see what is happening, load your data request by request and activate the data after each request.
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 [X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures]
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load data, I load it in two requests, and these are the only two requests in my DSO. How will the data look in the DSO: does it get stored in two separate rows or a single row? How is it shown in a query result?
If the keys are equal, you will have only one data set in your DSO.
If the keys do not match, how will the data be shown for key figures that were not loaded by that request?
Then you will have two data sets in your DSO.
3. I know that in a DSO we have two options, Overwrite/Addition. How will the data load behave in the following situation?
DataSource 1 feeds this in request 1:
X1 X2 Y1
a b 10
DataSource 2 feeds this in request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result be shown for our two options, Addition and Overwrite? Will request 2 overwrite or add up the data in Y1?
If you choose Overwrite, you will get 30; if you choose Addition, you will get 40.
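The Overwrite/Addition behavior described in this answer can be simulated in a few lines of Python. This is a sketch of the activation logic under the assumption that the record key is (X1, X2), not actual BW code:

```python
# Sketch of DSO update behavior per key figure, assuming the
# record key is (X1, X2). Illustrative only, not BW code.
def activate(active, request, mode):
    """Merge a request into the active table, key figure by key figure."""
    for key, values in request.items():
        if key not in active:
            active[key] = dict(values)  # first request for this key
        else:
            for kf, v in values.items():
                if mode == "overwrite":
                    active[key][kf] = v
                else:  # "addition"
                    active[key][kf] = active[key].get(kf, 0) + v
    return active

req1 = {("a", "b"): {"Y1": 10}}
req2 = {("a", "b"): {"Y1": 30, "Y2": 40, "Y3": 50}}

active_ow = activate(activate({}, req1, "overwrite"), req2, "overwrite")
active_add = activate(activate({}, req1, "addition"), req2, "addition")
print(active_ow[("a", "b")]["Y1"])   # 30
print(active_add[("a", "b")]["Y1"])  # 40
```

With matching keys the two requests collapse into one data set, and Y1 ends up as 30 under Overwrite or 10 + 30 = 40 under Addition, exactly as stated above.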
Thanks. -
Partial data loading from dso to cube
Dear All,
I am loading data from a DSO to a cube through a DTP. The problem is that some records get added to the cube through data packet 1 (around 50 crore records), while through data packet 2 the records are not getting added.
It is a full load from the DSO to the cube.
I have tried deleting the request and executing the DTP again, but the same number of records gets added to the cube through data packet 1, and after that no records are added; the request remains in yellow state only.
Please suggest.
Nidhuk,
A data load transfers package by package. Your situation sounds like it got stuck in the second package or something similar. I suggest you check the package size and try increasing it to a higher number to see if anything changes. 50 records per package is kind of low; your load should not spread out into too many packages.
Regards, Jen
Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM