Problem extracting LIS data
Hi,
We are having problems extracting LIS data.
I have checked the delta queue (RSA7), and 2LIS_13_VDITM has a total of 0.
I have also checked transaction LBWQ and can see that MCEX13 has 150028 entries and MCEX11 has 12811 entries in the queue, with destination NONE and status NOSEND.
What can I do to transfer all these entries to the delta queue so that they can be loaded to BW?
BR,
Linda
Hi Linda,
Search for RMBWV in the ABAP program input box (job name = *, user = *).
Go to SMQ1 in the source system; you'll see LUWs for the MCEX* queues.
In this transaction you can also see your BW delta queue; do you see LUWs for your DataSource as well?
You can schedule the update job via LBWE / Job Control; once it has run, the LUWs for MCEX* should decrease and your BW LUWs should increase...
let us know
hope this helps
Olivier.
Similar Messages
-
Request: Problem in Extraction of data from CRM to BW
Hi Gurus
I have problems extracting data from CRM to BW, and I hope you can solve them:
1) When I check the data for the DataSource in RSA3, I can see all records and the complete data. But when I extract and check the data in the cube, I can see all records yet not the complete data, i.e. some data is missing.
2) Another problem: in a report I have two characteristics (DATS), a start date and an end date, and I am trying to get the number of days from the start date to the end date.
Thanks in advance. If you find any good documentation, please send it to [email protected]
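For question 2, the day count is plain date arithmetic on the YYYYMMDD (DATS) values; in BEx it would typically be done with a formula variable with replacement path, but the underlying logic can be sketched in Python (illustrative only, not SAP code; the sample dates are made up):

```python
from datetime import datetime

def days_between(start_dats: str, end_dats: str) -> int:
    """Number of days between two DATS-style YYYYMMDD date strings."""
    fmt = "%Y%m%d"
    start = datetime.strptime(start_dats, fmt)
    end = datetime.strptime(end_dats, fmt)
    return (end - start).days

print(days_between("20100801", "20100815"))  # 14
```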
With regards
Nagamani.
Hi Krishna,
Yes, we did an enhancement to this DataSource. In production it is taking 27 hours to load the data.
The extraction from the source system (CRM) alone is taking nearly 24 hours, as per the job log.
08/05/2010 11:53:08 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 10,849 records
08/05/2010 12:02:02 Result of customer enhancement: 10,849 records
08/05/2010 12:02:03 PSA=0 USING & STARTING SAPI SCHEDULER
08/05/2010 12:02:03 Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
08/05/2010 12:02:06 IDOC: Info IDoc 2, IDoc No. 1576298, Duration 00:00:01
08/05/2010 12:02:06 IDoc: Start = 08/05/2010 10:26:37, End = 08/05/2010 10:26:38
08/05/2010 13:02:38 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 10,958 records
08/05/2010 13:11:48 Result of customer enhancement: 10,958 records
08/05/2010 13:11:52 Asynchronous send of data package 2 in task 0003 (1 parallel tasks)
08/05/2010 13:11:53 tRFC: Data Package = 1, TID = 0AA00D0301984C5AEE8E44DB, Duration = 00:16:
08/05/2010 13:11:53 tRFC: Start = 08/05/2010 12:02:19, End = 08/05/2010 12:18:27
08/05/2010 14:30:13 Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 11,296 records
08/05/2010 14:39:53 Result of customer enhancement: 11,296 records
Like this, the extraction itself is taking 24 hours.
-
Problem extracting data using 0EC_PCA_3 datasource
Hello every body.
Hope every thing is all right for you.
I'm facing an embarrassing problem and hope you can help!
I'm using DataSource 0EC_PCA_3 to extract data from GLPCA (an ECC table) into a BI DSO. My problem is that when I try to extract all data for a specific profit center group (or customer), I'm not getting the data for fiscal periods greater than 10. When I add a filter on account number (in the InfoPackage selection criteria), for example, all data is retrieved with no problem for every fiscal period!
Any body has faced this error or something like it?
Thanks for help!
Salah
Thanks everybody for the replies,
Yes, fiscal period is part of my DSO key!
At DataSource level (0EC_PCA_3), I'm not getting the data as it should be!
When I specify the profit center within my InfoPackage, the data is retrieved correctly; otherwise I'm not getting all the data!
Any idea? :s -
Problem with Text data Extraction from R/3
Hi Experts,
The problem is extracting text data from the R/3 system.
It is a delta load. The load runs fine, but it brings no data from the source system (R/3): 0 records. When I check the table in the R/3 system, the data is there.
I tried a full update too; it still brings 0 records.
What will be the solution for this?
Thanks in Advance!
Regds,
SPS
Hello SPS, I can't remember very well, but when we found this issue, the problem was the language key. There was a problem with the standard DataSource.
Did you try to find any notes for your datasource ?
Hugs,
Bueno -
Performance problem while selecting (extracting) the data
I have one intermediate table.
I am inserting rows derived from a select statement.
The select statement has a where clause which joins a view (created from 5 tables).
The problem is that the select statement which gets the data takes too much time.
I identified the problems like this:
1) The view used in the select statement is not indexed (is an index necessary on a view?)
2) The tables used to create the view are already properly indexed.
3) Extracting the data is what takes the time.
The query below extracts the data and inserts it into the intermediate table:
SELECT 1414 report_time,
2 dt_q,
1 hirearchy_no_q,
p.unique_security_c,
p.source_code_c,
p.customer_specific_security_c user_security_c,
p.par_value par_value, exchange_code_c,
(CASE WHEN p.ASK_PRICE_L IS NOT NULL THEN 1
WHEN p.BID_PRICE_L IS NOT NULL THEN 1
WHEN p.STRIKE_PRICE_L IS NOT NULL THEN 1
WHEN p.VALUATION_PRICE_L IS NOT NULL THEN 1 ELSE 0 END) bill_status,
p.CLASS_C AS CLASS,
p.SUBCLASS_C AS SUBCLASS,
p.AGENT_ADDRESS_LINE1_T AS AGENTADDRESSLINE1,
p.AGENT_ADDRESS_LINE2_T AS AGENTADDRESSLINE2,
p.AGENT_CODE1_T AS AGENTCODE1,
p.AGENT_CODE2_T AS AGENTCODE2,
p.AGENT_NAME_LINE1_T AS AGENTNAMELINE1,
p.AGENT_NAME_LINE2_T AS AGENTNAMELINE2,
p.ASK_PRICE_L AS ASKPRICE,
p.ASK_PRICE_DATE_D AS ASKPRICEDATE,
p.ASSET_CLASS_T AS ASSETCLASS
FROM (SELECT
DISTINCT x.*,m.customer_specific_security_c,m.par_value
FROM
HOLDING_M m JOIN ED_DVTKQS_V x ON
m.unique_security_c = x.unique_security_c AND
m.customer_c = 'CONF100005' AND
m.portfolio_c = 24 AND
m.status_c = 1
WHERE exists
(SELECT 1 FROM ED_DVTKQS_V y
WHERE x.unique_security_c = y.unique_security_c
GROUP BY y.unique_security_c
HAVING MAX(y.trading_volume_l) = x.trading_volume_l)) p
Can anyone please give me valuable suggestions on the performance?
Thanks for the update.
In the select query we used some functions, like MAX:
(SELECT 1 FROM ED_DVTKQS_V y
WHERE x.unique_security_c = y.unique_security_c
GROUP BY y.unique_security_c
HAVING MAX(y.trading_volume_l) = x.trading_volume_l)) p
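For reference, what this correlated GROUP BY / HAVING MAX pattern computes is "keep the rows with the maximum trading volume per security". The logic, sketched in Python with made-up sample data (not the real view):

```python
from collections import defaultdict

rows = [
    {"unique_security_c": "A", "trading_volume_l": 100},
    {"unique_security_c": "A", "trading_volume_l": 250},
    {"unique_security_c": "B", "trading_volume_l": 50},
]

# First pass: maximum trading volume per security
# (what the GROUP BY / HAVING MAX subquery computes).
# Assumes volumes are non-negative, matching defaultdict(int)'s 0 default.
best = defaultdict(int)
for r in rows:
    key = r["unique_security_c"]
    best[key] = max(best[key], r["trading_volume_l"])

# Second pass: keep only rows matching that maximum
# (what the EXISTS filter does).
kept = [r for r in rows if r["trading_volume_l"] == best[r["unique_security_c"]]]
print([r["trading_volume_l"] for r in kept])  # [250, 50]
```

In SQL, an analytic function such as MAX(trading_volume_l) OVER (PARTITION BY unique_security_c) often avoids re-scanning the view in a correlated subquery and is worth comparing in the execution plan.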
Will these types of functions cause the performance problem? -
Delta Problem while extracting the data by using function module
Hi Experts,
I have extracted the data using a function module. For the delta loads I specified Calday and additive delta in the generic delta settings. On the BW side I initialized the data, then selected the delta update option in the InfoPackage and also built a process chain. The process chain runs successfully every day, but it shows zero records even though there is data in the DataSource (RSA3); that is, the delta is not working.
What is the problem?
Are there any other settings for delta loads?
Please help me with this.
Helpful answer will be appreciated with points,
Regards,
Venkat
Hi,
Try the delta type NEW STATUS FOR CHANGED RECORDS.
More information (from Nagesh) that may help. There are two delta types here:
1. New status for changed records.
2. Additive delta.
New status for changed records means:
New order:
order no quantity
1000 10
The order is then changed to quantity 8, so in the first case 2 records are posted to BW:
1000 -10
1000 8
With additive delta, for the same case, the original record is sent as
1000 10
and the changed record as
1000 -2
This is the difference. New status for changed records should only be used with an ODS, not a cube, because a cube has no overwrite option.
Additive delta can be used for ODSs, and for cubes with update mode addition.
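A small sketch of the two delta styles described above (Python, illustrative only; the order number and quantities follow the example):

```python
# Order 1000 starts at quantity 10, then is changed to 8.

# "New status for changed records": before-image + after-image are sent.
new_status_delta = [("1000", -10), ("1000", 8)]

# "Additive delta": the original record, then only the net change.
additive_delta = [("1000", 10), ("1000", -2)]

# A DSO in overwrite mode ends up with the after-image:
final_overwrite = dict(new_status_delta)["1000"]        # 8

# A cube in addition mode sums the additive records:
final_additive = sum(qty for _, qty in additive_delta)  # 8
```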
Regards,
Satya -
Problem extracting CRM DataSource 0BP_ROLES_ATTR
Hi ,
While extracting DataSource 0BP_ROLES_ATTR, I successfully uploaded the data into the PSA, but when I tried to load the data into the ODS I got the error:
The value '99991231235959 ' from field VALID_TO is not convertible into
the DDIC data type DATS of the InfoObject in data record 217 . The field
content could not be transferred into the communication structure
format.
System Response
The data to be loaded has a data error or field VALID_TO in the transfer
structure is not mapped to a suitable InfoObject.
The conversion of the transfer structure to the communication structure
was terminated. The processing of data records with errors was continued
in accordance with the settings in error handling for the InfoPackage
(tab page: Update Parameters).
Hi Shahina,
0VALID_TO has data element DATS, which holds a date in YYYYMMDD format. The data coming from the DataSource contains YYYYMMDDHHMMSS. You need to filter out the HHMMSS part; for this you can either write a transformation routine or use a formula in your transformation. -
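A footnote on the VALID_TO fix just described: the conversion simply keeps the first 8 characters (YYYYMMDD) of the YYYYMMDDHHMMSS value. In SAP this would be a small ABAP field routine; Python is used here only to illustrate the logic:

```python
def to_dats(timestamp: str) -> str:
    """Truncate a YYYYMMDDHHMMSS timestamp to a DATS-style YYYYMMDD date."""
    # Trailing blanks (as in the error message's '99991231235959 ') are stripped first.
    return timestamp.strip()[:8]

print(to_dats("99991231235959 "))  # 99991231
```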
How to populate LIS data ?
Hi all,
Can anybody tell me how the following tables gets populated.
S021,S022,S023
S024,S025,S026
S027,S034,S035
S094,S225,S226,
S227,S987,S998.
Thanks,
Syed
Check the LIS configuration in transaction LBW1 for the infostructures (tables) that you require.
SAP-built infostructures are populated via the normal SAP transactions.
To display an infostructure, go to transaction MC23; to create one, go to MC21.
For details regarding infostructures, check the tables starting with MCS.
Custom infostructures start with S5nn.
Also check LBW0 (Interface LIS Information Structures).
When transaction LBW0 is run for a LIS structure, a check is made against table TMCBIWU. If the structure exists in this table, it is available as a DataSource for extracting data to BW; if it does not, it is not available and we are unable to generate the extraction structure.
When we create a custom LIS structure, i.e. 'Snnn' where 'nnn' > 499, we usually create it with 'Type of IS' set to 'T', which means it is available for transfer to BW; an entry in the table is created in the background, so we do NOT face a problem here.
Most of the SAP-supplied LIS structures, however, are created with type 'Standard', although an entry for these is provided in table TMCBIWU. Since we cannot change the Type of IS of SAP-supplied LIS structures and we cannot create entries in table TMCBIWU ourselves (maintenance is NOT supported), any LIS structure without an entry in table TMCBIWU cannot be generated as a DataSource. Such structures have to be copied to a custom structure with Type of Information Structure 'T' and made available as a DataSource that way.
Hope it helps. Reward points for all helpful answers.
Thanks,
Balaji
-
Error while extracting the data in R/3 production system,
Hi Team,
We got the following error while extracting data in the R/3 production system:
Error 7 When Sending an IDoc R3 3
No storage space available for extending the internal table 44 R3 299
No storage space available for extending the internal table R3 299
Error in Source System RSM 340
Please guide us to fix the issue.
It's very difficult to help you without knowing:
- what is being transferred
- where you get this error
- the system configuration
- actual memory usage
- the operating system
- the database and its configuration, etc.
I suggest you open an OSS call and let support have a look at your system. It's much easier to find the cause of such a problem with access to the system.
Markus -
Unable to extract the data from ECC 6.0 to PSA
Hello,
I'm trying to extract data from ECC 6.0, DataSource 2LIS_11_VAHDR, into BI 7.0.
When I try to run a full load into the PSA, I get the following error message:
Error message: "DataSource 2LIS_11_VAHDR must be activated"
Actually, the DataSource is already active; I looked at it using transaction LBWE and it is active.
In BI, when I right-click the DataSource (2LIS_11_VAHDR) and select "Manage", the system throws the following error message:
"Invalid DataStore object name /BIC/B0000043: Reason: No valid entry in table RSTS"
If anybody has faced this error message, please advise what I'm doing wrong.
Thanks in advance.
ECC 6.0 side:
Delete the setup tables.
Fill the setup tables.
Schedule the job.
I can see the data using RSA3 (2LIS_11_VAHDR): 1000 records.
BI 7.0 (Service Pack 15):
Replicated the DataSource in production in the background.
Migrated the DataSource from 3.5 to 7.0 in development.
I didn't migrate from 3.5 to 7.0 in production; the system does not allow it.
When I try to schedule the InfoPackage, it gives the error message "DataSource is not active".
I'm sure this problem is related to the DataSource 3.5 to 7.0 conversion in production. In development there is no problem because I converted the DataSource from 3.5 to 7.0 manually.
Thanks -
Not able to extract performance data from .ETL file using xperf commands.
Xperf Commands:
xperf -i C:\TempFolder\Test.etl -o C:\TempFolder\BootData.csv -a process
I get the following error after executing the above command:
"33288636 events were lost in this trace. Data may be unreliable.
This is usually caused by insufficient disk bandwidth for ETW logging.
Please try increasing the minimum and maximum number of buffers and/or the buffer size.
Doubling these values would be a good first attempt.
Please note, though, that this action increases the amount of memory reserved for ETW buffers, increasing memory pressure on your scenario.
See "xperf -help start" for the associated command line options."
I changed the page file size, but it does not work for me.
Does anyone have an idea how to solve this problem and extract the ETL file data?
I want to mention one point here: I have 4 machines in total, and on 3 of them the above commands work properly. Only one machine has this problem.
Hi,
You can try using xperf to collect the trace ETL file and see whether it can be extracted on this computer:
Refer to following articles:
start
http://msdn.microsoft.com/en-us/library/windows/hardware/hh162977.aspx
Using Xperf to take a Trace (updated)
http://blogs.msdn.com/b/pigscanfly/archive/2008/02/16/using-xperf-to-take-a-trace.aspx
Kate Li
TechNet Community Support -
InfoObject change for extracting text data
Hi BW guys,
Here is my requirement.
I have one InfoObject, 'salesmen', which is already used in some other ODSs and cubes.
Now I want to extract text data for the object 'salesmen'. For that I need to change my InfoObject (the change: adding the credit control area object under compounding).
But when I try to activate the InfoObject again, it gives errors.
Error messages:
1) InfoObject XXXXX (or ref.) is used in data targets with data -> Error:
2) Characteristic XXXXX: Compound or reference was changed
3) InfoObject XXXXX is being used in InfoCube XXXX (contains data)
etc....
But i don't want to delete the data in any data target.
Is there any way to solve this problem?
Thanks in advance...
Hi,
If you don't have many cubes and ODSs with this salesman, you can consider another, better, but more time-consuming way:
1. Create a new IO for your salesman, adding the compounding attribute as you want.
2. Load master data for the new IO.
3. Create copies of your InfoProviders.
4. In each of them, delete the old salesman IO and insert the new one.
5. Create export DataSources for the old cubes.
6. Create update rules for the new data targets based on the old ones.
7. In the URs, map your new IO to the old one. All other IOs should be mapped 1:1 (new <- old).
8. Reload the data targets.
That's all.
The way I proposed earlier is less preferable, because you would have to change the data already loaded into the data targets anyway, and in that case it's better to change the data model the way you want.
Best regards,
Eugene -
How to extract the data from R/3?
Gurus,
I need to extract data from R/3 to BI. Below I give the table name, followed by the field name and the field description.
I am using BI 7.0 and ECC 6.0.
I have the following doubts:
1. Based on the information below, which extraction method do I need to use (generic data extraction?)?
2. I tried to create a view, but I could not find joins between all the tables; I found only a few. How do I join the other tables?
3. Without joining these tables, how can I collect the data?
4. Do I need to create multiple tables?
SAP Table SAP Field Field Description
VBAK-> VBELN-> Sales Document
VBAP-> WERKS-> Plant
VBAK-> AUART-> Doc Type
VBAP-> PSTYV-> Item Category
VBAP-> MATNR-> Material
VBAP-> ARKTX-> Description
MATNR-> GROES->
VBAP-> ZIEME-> Target Quantity UOM
VBAP-> WAVWR-> Cost
VBAK-> ERNAM-> Name of a person who created the object
VBAKA-> KUNNR-> Customer Number 1
VBAP-> BSTNK-> Customer purchase Order no
VBEP-> BANFN-> Purchase Requisition Number
QMFE-> FEGRP-> Cause Code(Code Group Problem)
QMFE-> FECOD-> Damage Code(Problem or Damage Code)
QMEL-> QMNUM-> Notification No
EQUI-> EQUNR-> Equipment Number
QMEL-> QMART-> Notification Type
QMELI-> ERDAT-> Date on Which Record Was Created
============================================================================================
Gurus I will assign full points .
Thanks
Hi,
I would use the standard extractor 2LIS_11_VAITM and enhance it with the following fields:
VBEP-> BANFN-> Purchase Requisition Number
QMFE-> FEGRP-> Cause Code(Code Group Problem)
QMFE-> FECOD-> Damage Code(Problem or Damage Code)
QMEL-> QMNUM-> Notification No
EQUI-> EQUNR-> Equipment Number
QMEL-> QMART-> Notification Type
QMELI-> ERDAT-> Date on Which Record Was Created
regards
Siggi -
When extracting SD data from R/3, a tRFC error is raised: "No service for..."
Hello fellows,
when I'm extracting SD data from R/3, an IDoc error is raised: "No service for system SAPSB2, client 100 in Integration Directory", and there are no work processes running in the R/3 system. Could anybody help? Thanks!
Jennifer
Hi,
You are extracting data from ECC, so I don't think XI is involved here. I think the problem is with the partner profile and IDoc settings, so check them in WE20.
or
Take the IDoc numbers from BW -> RSMO (at the bottom of the detail tab you can find the OLTP IDoc numbers), then go to ECC, run BD87, enter that IDoc number and check.
Also check the IDoc status in WE05. This is a connection problem only, so check with the Basis team.
Thanks
reddy -
Problem in populating data to Logical System field
Hi All,
I am extracting GR data through a generic extraction. I created a view using four tables: MSEG, EKBE, EKKO and T001. I have a field LOGSY (logical system), which is in the EKKO table. My problem is that the LOGSY field contains no data. I have to write a routine to populate this field. Can anybody please tell me the code?
Hello,
the code is:
SELECT SINGLE logsys
FROM T000
INTO your_variable
WHERE mandt = sy-mandt.
Assign points if useful,
Best Regards,
Maciej