DataSource and PreparedStatement
I am using a DataSource and PreparedStatement in WebLogic 8.1 for a J2EE application.
It works if we use only EJBs, but if we use an enterprise app (i.e. servlets, JSPs, EJBs, etc.) it does not work.
Can somebody tell me the reason behind this?
Why do you feel this is an issue with the PreparedStatements and not just the DataSource? Are you sure you have configured the DataSource correctly for your platforms outside your EJB container?
Similar Messages
-
Too many connections - even after closing ResultSets and PreparedStatements
I'm getting a "Too many connections" error with MySQL when I run my Java program.
2007-08-06 15:07:26,650 main/CLIRuntime [FATAL]: Too many connections
com.mysql.jdbc.exceptions.MySQLNonTransientConnectionException: Too many connections
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:921)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2870)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:812)
at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3269)
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1182)
at com.mysql.jdbc.Connection.createNewIO(Connection.java:2670)
I researched this and found out that I wasn't closing the ResultSet and the PreparedStatement.
The JDBC connection is closed by a central program that handles connections (custom connection pooling).
I added the code to close all ResultSets and PreparedStatements, and re-started MySQL as per the instructions here
but still get "Too many connections" error.
A few other things come to mind, as to what I may be doing wrong, so I have a few questions:
1) A few PreparedStatements are created in one method, and they are used and closed in a 2nd method -
does this cause the "Too many connections" error?
2) I have 2 different ResultSets, in nested while loops where the outer loop iterates over the first ResultSet and
the inner loop iterates over the second ResultSet.
I have a try-finally block that wraps the inner while loop, and I'm closing the second ResultSet and PreparedStatement
in the inner while loop.
I also have a try-finally block that wraps the outer while loop, and I'm closing the first ResultSet and PreparedStatement
in the outer while loop as soon as the inner while loop completes.
So, in the above case the outer while loop's ResultSet and PreparedStatements remain open until the inner while loop completes.
Does the above cause "Too many connections" error?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The following are the relevant sections of my code (it is partially pseudo-code) showing the above 2 cases:
init( Connection jdbcConnection ){
    String firstSQLStatement = "....";
    PreparedStatement ps1 = jdbcConnection.prepareStatement( firstSQLStatement );
    String secondSQLStatement = "....";
    PreparedStatement ps2 = jdbcConnection.prepareStatement( secondSQLStatement );
    String thirdSQLStatement = "....";
    PreparedStatement ps3 = null;
    ResultSet rsA = null;
    try{
        ps3 = jdbcConnection.prepareStatement( thirdSQLStatement );
        rsA = ps3.executeQuery();
        if( rsA.next() ){
            rsA.getString( 1 );
        }
    }finally{
        if( rsA != null )
            rsA.close();
        if( ps3 != null )
            ps3.close();
    }
}
//Notice how ps1 and ps2 are created here but not used immediately; only ps3 is
//used immediately. ps1 and ps2 are used in another method (so in the real code
//they would have to be instance fields, not locals).
run( Connection jdbcConnection ){
    ResultSet rs1 = null;
    try{
        rs1 = ps1.executeQuery();
        while( rs1.next() ){
            String s = rs1.getString( 1 );
            ps2.setString( 1, s );
            ResultSet rs2 = null;
            try{
                rs2 = ps2.executeQuery();
                while( rs2.next() ){
                    String s2 = rs2.getString( 1 );
                }
            }finally{
                if( rs2 != null )
                    rs2.close();
            }
        }
    }catch( Exception e ){
        e.printStackTrace();
    }finally{
        if( rs1 != null )
            rs1.close();
        if( ps1 != null )
            ps1.close();
        //ps2 is closed here, not in the inner finally: closing it there
        //would break the second iteration of the outer loop.
        if( ps2 != null )
            ps2.close();
    }
}
//Notice in the above case rs1 and ps1 are closed only after the inner
//while loop completes.
I appreciate any help.
Thanks for your reply.
I will look at the central connection pooling mechanism (which was written by someone else), but it is being used by many other Java programs that others have written.
They are not getting this error.
An addendum to my previous note, I followed the instructions here.
http://dev.mysql.com/doc/refman/5.0/en/too-many-connections.html
There's probably something else in my code that is not closing the connection.
But I just wanted to rule out whether opening a PreparedStatement in one method and closing it in another is a problem,
or whether nested ResultSet loops cause the problem.
I've read in a few threads that "Too many connections" can occur for unclosed ResultSets and PreparedStatements, and not just JDBC connections. -
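The failure mode described in this thread can be illustrated with a toy model (the class and limit here are made up for illustration and are not MySQL's real internals): handles that are opened but never released eventually exhaust a fixed server-side limit, after which every further open fails.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Toy stand-in for a server with a fixed connection limit.
class ToyServer {
    private final int max;
    private final AtomicInteger open = new AtomicInteger();

    ToyServer(int max) { this.max = max; }

    // Returns false once the limit is reached ("Too many connections").
    boolean openConnection() {
        if (open.get() >= max) return false;
        open.incrementAndGet();
        return true;
    }

    void closeConnection() { open.decrementAndGet(); }

    public static void main(String[] args) {
        // Leaky client: opens without ever closing -> fails after the limit.
        ToyServer server = new ToyServer(10);
        int successfulLeaky = 0;
        for (int i = 0; i < 20; i++) {
            if (server.openConnection()) successfulLeaky++;
        }
        System.out.println("leaky client opened: " + successfulLeaky); // 10

        // Well-behaved client: releases in finally, so it never hits the limit.
        ToyServer server2 = new ToyServer(10);
        int successfulClean = 0;
        for (int i = 0; i < 20; i++) {
            boolean ok = server2.openConnection();
            try {
                if (ok) successfulClean++;
            } finally {
                if (ok) server2.closeConnection();
            }
        }
        System.out.println("clean client opened: " + successfulClean); // 20
    }
}
```

The same logic applies whether the leaked handle is a Connection the pool never reclaims, or a statement whose close the pool implementation depends on.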
How to use DataSource and External transaction in 9ias?
I'm working on a project that the application server needs to connect to over 100 databases.
I'd like to use connection pooling and external transaction service defined in OC4J's Datasources.
I wonder if anyone has an example of using datasource and external transaction service for OC4J.
Right now, I export toplink project to a java source and do the initialization there manually but I don't know how to use Datasource to get connections and how to use the external transaction service in the java code for OC4J.
I really appreciate you help.
Wei
Here is a fill-in-the-blank example of how you could set this up through code:
Project project = new MyProject();
// alternatively, use the XMLProjectReader
server = project.createServerSession();
server.getLogin().useExternalConnectionPooling();
server.getLogin().setConnector(new JNDIConnector(new javax.naming.InitialContext(), "jdbc/DataSourceName"));
// the next line depends on the type of driver you want to use.
server.getLogin().useOracleThinJDBCDriver();
server.getLogin().useOracle();
server.getLogin().setUserName("username");
server.getLogin().setPassword("password");
server.getLogin().useExternalTransactionController();
server.setExternalTransactionController(new Oracle9iJTSExternalTransactionController());
server.logMessages();
server.login(); -
Compare creation of DataSources and JMS Queues: SAP vs. WebLogic/WebSphere
I am used to creating JDBC DataSources and JMS Queues on WebLogic/WebSphere through their admin applications.
Can someone compare/contrast that process with the one on SAP NetWeaver (either using NWA or the Visual Administrator)?
Thanks
Hi Parag,
For the process of creating JDBC datasources and JMS resources in SAP NetWeaver, you can refer to the documents here on SDN and help.sap.com and compare the processes for yourself, thus not being influenced by others' biased or unbiased opinions.
For NetWeaver 04 and 04s these would be:
<a href="http://help.sap.com/saphelp_nw04/helpdata/en/b0/6e62f30cbe9e44977c78dbdc7a6b27/frameset.htm">JDBC Connector Service</a>
<a href="http://help.sap.com/saphelp_nw04/helpdata/en/22/cf4e71c46cdb4da31153be96c5389f/frameset.htm">JMS Connector Service</a>
For the <a href="https://www.sdn.sap.com/irj/sdn/javaee5">Java EE 5 Edition</a>:
<a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7bb9751d-0e01-0010-febd-c3adce2c408c">Working with Database Tables, DataSources and JMS Resources</a>
<a href="https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/806e75a0-0e01-0010-2587-fc518de8ac1a">Administration Guide</a> -> section "Application Resources Management" (pages 89-104)
Hope that helps!
Your feedback/findings are very welcome!
-Vladimir -
What are the settings for datasource and infopackage for flat file loading
Hi,
I'm trying to load data from a flat file to a DSO. Can anyone tell me what the settings for the datasource and InfoPackage are for flat file loading?
Please let me know.
Regards,
Kumar
Loading of transaction data in BI 7.0: a step-by-step guide on how to load data from a flat file into the BI 7 system.
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the datasource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In left panel select info provider
Select created info area and right click to create ODS( Data store object ) or Cube.
Specify a name for the ODS or cube and click Create.
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In the source of the transformation, select the object type (datasource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create a Data Transfer Process (DTP) by right-clicking the data target.
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right-click the data target, select Manage, and in the Contents tab select Contents to view the loaded data. There are two tables in an ODS, the new table and the active table; to move data from the new table to the active table, you have to activate it after selecting the loaded data. Alternatively, the monitor icon can be used.
Loading of master data in BI 7.0:
For Uploading of master data in BI 7.0
Log on to your SAP
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in the Fields tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an InfoPackage by right-clicking the datasource, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In the source of the transformation, select the object type (datasource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets. -
What is the difference between Statement and PreparedStatement?
hi,
recently I attended a telephonic interview. They asked me: what is the difference between Statement and PreparedStatement, and when would you use each? Hi! Can anyone tell me the real difference?
Sorry dear,
I am already shortlisted, and Monday is my HR round. Not everyone is like you.
So you have read the examples and explanations that you found when you googled, and you have read the Javadoc, and you still don't understand? And you are shortlisted? For what? I hope you won't do server programming.
I will give you a few hints.
Escaping of data
Storing of dates
Safer
Faster -
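To make the first hint (escaping of data) concrete, here is a small sketch; the table and column names are invented for illustration. A plain Statement forces you to splice user input into the SQL text, while a PreparedStatement placeholder lets the driver bind and escape the value:

```java
public class EscapingHint {
    // What a plain Statement forces you to do: build SQL by concatenation.
    static String concatenated(String name) {
        return "SELECT * FROM users WHERE name = '" + name + "'";
    }

    public static void main(String[] args) {
        // An innocent value containing a quote already corrupts the query
        // text; a malicious one could rewrite it entirely (SQL injection).
        String input = "O'Brien";
        System.out.println(concatenated(input));
        // -> SELECT * FROM users WHERE name = 'O'Brien'   (malformed SQL)

        // With a PreparedStatement the SQL text never changes; the driver
        // binds the value safely:
        //   PreparedStatement ps =
        //       con.prepareStatement("SELECT * FROM users WHERE name = ?");
        //   ps.setString(1, input);
    }
}
```

The "storing of dates" and "faster" hints follow from the same mechanism: setDate() hands the driver a typed value instead of a formatted string, and the unchanging SQL text is what allows precompilation.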
Performance Problem - MS SQL 2K and PreparedStatement
Hi all
I am using MS SQL 2000 and a PreparedStatement to retrieve data. There is a strange and serious performance problem when the PreparedStatement contains "?" and uses the PreparedStatement.setX() methods to set its values. I performed a test with the following code.
for (int i = 0; i < 10; i++) {
    try {
        con = DBConnection.getInstance();
        statement = con.prepareStatement("SELECT * FROM cardno WHERE car_no = '" + cardNo + "'");
        // statement = con.prepareStatement("SELECT * FROM cardno WHERE car_no = ?");
        // statement.setString(1, cardNo);
        rs = statement.executeQuery();
        if (rs.next()) {
        }
    } catch (SQLException e) {
        e.printStackTrace();
    } finally {
        try {
            rs.close();
            statement.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
Iterations   Time (ms)
1            961
10           1061
200          1803
for (int i = 0; i < 10; i++) {
    try {
        con = DBConnection.getInstance();
        // statement = con.prepareStatement("SELECT * FROM cardno WHERE car_no = '" + cardNo + "'");
        statement = con.prepareStatement("SELECT * FROM cardno WHERE car_no = ?");
        statement.setString(1, cardNo);
        rs = statement.executeQuery();
        if (rs.next()) {
        }
    } catch (SQLException e) {
        e.printStackTrace();
    } finally {
        try {
            rs.close();
            statement.close();
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
Iterations   Time (ms)
1            1171
10           2754
100          18817
200          36443
The above test was performed with the DataDirect JDBC 3.0 driver. The version that uses ? and setString takes much longer to execute, even though it is supposed to be faster because of precompilation of the statement.
I have tried different drivers - the one provided by MS, DataDirect, and the Sprinta JDBC driver - but all suffer from the same problem to different extents. So I am wondering whether MS SQL doesn't support precompiled statements, so that no matter what JDBC driver I use I will still have the performance problem. If so, many O/R mappers cannot be used, because I believe most of them, if not all, use precompiled statements.
Best regards
Edmond
Edmond,
Most JDBC drivers for MS SQL (and I think this includes all the drivers you tested) use sp_executesql to execute PreparedStatements. This is a pretty good solution as the driver doesn't have to keep any information about the PreparedStatement locally, the server takes care of all the precompiling and caching. And if the statement isn't already precompiled, this is also taken care of transparently by SQL Server.
The problem with this approach is that all names in the query must be fully qualified. This means that the driver has to parse the query you are submitting and make all names fully qualified (by prepending a db name and schema). This is why creating a PreparedStatement takes so long using these drivers (and why it does so every time you create it, even though it's the same PreparedStatement).
However, the speed advantage of PreparedStatements only becomes visible if you reuse the statement a lot of times.
As for why the PreparedStatement with no placeholder is much faster, I think it is because of internal optimisations (maybe the statement is run as a plain statement?).
As a conclusion, if you can reuse the same PreparedStatement, then the performance hit is not so high. Just ignore it. However, if the PreparedStatement is created each time and only used a few times, then you might have a performance issue. In this case I would recommend you try out the jTDS driver ( http://jtds.sourceforge.net ), which uses a completely different approach: temporary stored procedures are created for PreparedStatements. This means that no parsing is done by the driver and PreparedStatement caching is possible (i.e. the next time you are preparing the same statement it will take much less as the previously submitted procedure will be reused).
Alin. -
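The reuse advice above can be illustrated with a toy counter that stands in for the per-prepare cost (the query parsing and sp_executesql round-trip described in this reply); the class names and the 200-iteration count are made up for illustration:

```java
// Counts how many times a statement gets "prepared", as a stand-in for
// the per-prepare cost (query parsing, server round-trip) described above.
class PrepareCounter {
    int prepares = 0;

    String prepare(String sql) {
        prepares++; // each call models one expensive server-side prepare
        return sql; // stand-in for the statement handle
    }
}

public class ReuseDemo {
    public static void main(String[] args) {
        String sql = "SELECT * FROM cardno WHERE car_no = ?";

        // Anti-pattern: prepare inside the loop -> pays the cost every pass.
        PrepareCounter perIteration = new PrepareCounter();
        for (int i = 0; i < 200; i++) {
            perIteration.prepare(sql);
        }

        // Reuse: prepare once outside the loop, bind new values each pass.
        PrepareCounter reused = new PrepareCounter();
        String handle = reused.prepare(sql);
        for (int i = 0; i < 200; i++) {
            // here you would call setString(1, ...) and executeQuery()
            // on the same handle
        }

        System.out.println(perIteration.prepares + " prepares vs " + reused.prepares);
        // -> 200 prepares vs 1
    }
}
```

This is why the timing table above degrades so sharply with iteration count when the statement is re-prepared each time.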
DataSources and extract structures generation
Hi,
How would I generate the DataSources and extract structures using transaction LBWE? It seems that there are no pushbuttons to generate them from LBWE. Would clicking on the DataSources and extract structures and then clicking the Continue button be enough?
Thanks
I checked the LBWE maintenance structure column, and the fields are there on the left-hand side for the selection criteria.
How would I check that "...<i>the fields you have enhanced are ready to use</i>..." from LBWE?
From RSA6, I would double-click on the LIS datasource; I see no enhanced fields and therefore could not tell whether they are hidden.
"<i>Did you check in LBWE if your structures have the fields ready to use?" </i>How would I do this? Maybe this is the problem.
Thanks
Datasources and Extract structure
Hi guys,
I am pretty confused about DataSources and extract structures.
Can someone please explain them to me in simple words?
Hi Zubin,
A DataSource is a consolidated list of the available fields, and the extract structure is a data dictionary structure which describes those fields together with additional technical elements like data element, domain and so on...
To make it clearer, look at this description from the F4 help:
A DataSource is an object for retrieving data. The DataSource is localized in the OLTP system.
It has
an extract structure,
an extraction type, and
an extraction method.
The extract structure describes the fields of the internal table that contain the extracted data. The extraction type describes which type of extractor the DataSource uses. The extraction method extracts the data and transfers it into an internal table with the same type as the extract structure.
The DataSource also contains information on the type of data that it stages, for example attributes, transactional data, hierarchies, or texts. It can also support different types of data update.
Extract Structure for a DataSource
The extraction structure for a data source shows the format in which the DataSource, or the extractor for the DataSource, transfers its data.
A data element must be assigned to each field of the extraction structure. This allows an intelligent mapping in the Business Information Warehouse between field names and InfoObjects using just this data element.
The extract structure must be created in the DDIC as a dictionary structure or transparent table. A view is not permitted here since it would then not give you the option to add an append.
Appends enable you to convert your individual requirements and own "Business Logic" in the extraction process. You can fill the fields in the append using function enhancements.
Hope this helps
Thanks,
Raj -
Business scenarios on SD and MM modules for generic datasources and enhance
Hi,
Can anybody send some documents on generic datasources and enhancements for the SD and MM modules? Please send them to my mail ID: [email protected]
Thanks,
Chinna
Hi,
MM Process flow:
The typical procurement cycle for a service or material consists of the following phases:
1. Determination of Requirements
Materials requirements are identified either in the user departments or via materials planning and control. (This can cover both MRP proper and the demand-based approach to inventory control. The regular checking of stock levels of materials defined by master records, use of the order-point method, and forecasting on the basis of past usage are important aspects of the latter.) You can enter purchase requisitions yourself, or they can be generated automatically by the materials planning and control system.
2. Source Determination
The Purchasing component helps you identify potential sources of supply based on past orders and existing longer-term purchase agreements. This speeds the process of creating requests for quotation (RFQs), which can be sent to vendors electronically via SAP EDI, if desired.
3. Vendor Selection and Comparison of Quotations
The system is capable of simulating pricing scenarios, allowing you to compare a number of different quotations. Rejection letters can be sent automatically.
4. Purchase Order Processing
The Purchasing system adopts information from the requisition and the quotation to help you create a purchase order. As with purchase requisitions, you can generate POs yourself or have the system generate them automatically. Vendor scheduling agreements and contracts (in the SAP System, types of longer-term purchase agreement) are also supported.
5. Purchase Order Follow-Up
The system checks the reminder periods you have specified and - if necessary - automatically prints reminders or expediters at the predefined intervals. It also provides you with an up-to-date status of all purchase requisitions, quotations, and purchase orders.
6. Goods Receiving and Inventory Management
Goods receiving personnel can confirm the receipt of goods simply by entering the PO number. By specifying permissible tolerances, buyers can limit over- and underdeliveries of ordered goods.
7. Invoice Verification
The system supports the checking and matching of invoices. The accounts payable clerk is notified of quantity and price variances because the system has access to PO and goods receipt data. This speeds the process of auditing and clearing invoices for payment.
Common Tables used by SAP MM:
Below are few important Common Tables used in Materials Management Modules:
EINA Purchasing Info Record- General Data
EINE Purchasing Info Record- Purchasing Organization Data
MAKT Material Descriptions
MARA General Material Data
MARC Plant Data for Material
MARD Storage Location Data for Material
MAST Material to BOM Link
MBEW Material Valuation
MKPF Header- Material Document
MSEG Document Segment- Material
MVER Material Consumption
MVKE Sales Data for materials
RKPF Document Header- Reservation
T023 Mat. groups
T024 Purchasing Groups
T156 Movement Type
T157H Help Texts for Movement Types
MOFF Lists what views have not been created
A501 Plant/Material
EBAN Purchase Requisition
EBKN Purchase Requisition Account Assignment
EKAB Release Documentation
EKBE History per Purchasing Document
EKET Scheduling Agreement Schedule Lines
EKKN Account Assignment in Purchasing Document
EKKO Purchasing Document Header
EKPO Purchasing Document Item
IKPF Header- Physical Inventory Document
ISEG Physical Inventory Document Items
LFA1 Vendor Master (General section)
LFB1 Vendor Master (Company Code)
NRIV Number range intervals
RESB Reservation/dependent requirements
T161T Texts for Purchasing Document Types
Transaction Codes:
RFQ to Vendor - ME41
Raising Quotation - ME47
Comparison of Price - ME49
Creation of PO - ME21N
Goods Receipt - MIGO
Invoice (Bill PAssing) - MIRO
Goods Issue - MB1A
Physical Inventory - MI01( Create doc)
MI04 (Enter Count)
MI07 (Post)
Also please check this links.
http://www.sapgenie.com/sapfunc/mm.htm
http://www.sap-basis-abap.com/sapmm.htm
SD Process Flow:
The sales documents you create are individual documents, but they can also form part of a chain of inter-related documents. For example, you may record a customer's telephone inquiry in the system. The customer next requests a quotation, which you then create by referring to the inquiry. The customer later places an order on the basis of the quotation, and you create a sales order with reference to the quotation. You ship the goods and bill the customer. After delivery of the goods, the customer claims credit for some damaged goods, and you create a free-of-charge delivery with reference to the sales order. The entire chain of documents - the inquiry, the quotation, the sales order, the delivery, the invoice, and the subsequent free-of-charge delivery - creates a document flow or history. The flow of data from one document into another reduces manual activity and makes problem resolution easier. Inquiry and quotation management in the Sales Information System help you to plan and control your sales.
Transaction Codes:
Inquiry - VA11/VA12/VA13
Quotation - VA21/VA22/VA23
Sales Order - VA01/VA02/VA03
Delivery - VL01N/VL02N/VL03N
Billing/Invoicing - VF01/VF02/VF03
Also please check this links.
http://www.sapgenie.com/sapfunc/sd.htm
http://www.sap-basis-abap.com/sapsd.htm
http://www.sapgenie.com/abap/tables_sd.htm
Production Planning:
For example, consider a pump manufacturing plant. Based on customer requirements, planning is done for future months (we plan for a quantity on a particular date). A pump is an assembly: the main component is manufactured in the plant and the others are procured. In PP we are concerned only with in-house manufacturing, but the final assembly (considering the BOM) can be done only if the procured components are available. MRP helps in planning for shortages on a particular date based on the planned date. A work center (e.g. a lathe) is the place where components are machined/assembled (where the operation is done). The sequence of operations is the routing (lead time scheduling data comes from the routing). In the MRP run, basic or lead time scheduling is done (we need to know the start/finish dates). On creation of the production order, the system checks which BOM and routing to pick (if there are many routings or BOMs for that particular finished product). Availability checks for material, PRTs and capacity need to be done. On release of the order, confirmation of the order can be done (on completion of the order, or after manufacturing the quantities). GI and GR have to be done. The PP flow ends here.
step 1: creation of master data (Material master,BOM,Work center,Routing)
step 2: Planning - can be done by Planned Independent Requirement (MD61) or Independent Requirement (MD81).
We plan for a quantity on a given date (the finish date).
step 3: MD04 - stock/requirements list (the plan made can be viewed in MD04)
step 4: MRP run - MD02; the PIR is converted into a planned order
step 5: planned order to be converted to a production order - CO40, CO41
step 6: production order to be released - CA02
step 7: confirm the production order (order confirmation - CO15), after which GI and GR are done
assign pts if helpful... -
How to find datasource and standard cube for given R/3 table
Hi All,
Can anyone help me with how to find the datasource, standard cube, and standard BI reports for a given R/3 table?
For example, if I have the tables "EKKO (Header)
EKPO (Line item)
EKET (Schedule Lines)
EKBE (PO History)
EKKN (Account Assignment/WBS)
EKAN/EKPA (Partner Info)" - for all these tables, how can I find the standard datasource and standard cube?
Please help in this. I will assign the points.
Regards,
Shanthi
https://wiki.sdn.sap.com/wiki/display/BI/BWSDMMFIDATASOURCES
MM
Data Sources Tables
Purchasing
2LIS_02_SCL EKKO, EKBE, T001, T001W, EKET, EKPA.
2LIS_02_HDR EKKO, EKBE, T001, EKPA.
2LIS_02_ITM EKKO, EKBE, T001, T001W, EKPO, TMCLVBW, T027C, ESSR, T147K, T147
2LIS_02_SCN EKET, EKES, EKPO.
2LIS_02_CGR EKBE, EKES, EKPO.
2LIS_02_SGR EKET, EKBE, KKPO
Inventory
2LIS_03_BX stock tables, MCHA, MARA, T001, T001W, CALCULATED FROM MBEW, EBEW, QBEW.
2LIS_03_BF MSEG, MBEW, MKPF.
2LIS_03_UM BKPF, MBEW, QBEW, EBEW, BSEG.
MM
Purchasing Datasources
ODS
0PUR_O01 2LIS_02_ITM, 2LIS_02_SGR, 2LIS_02_CGR, 2LIS_02_SCN.
0PUR_O02 2LIS_02_HDR, 0PUR_O01
0PUR_DS03 2LIS_02_SCL and 2LIS_02_SGR.
CUBE
0PUR_C10 2LIS_02_SCL and 2LIS_02_SGR.
0PUR_C07
0PUR_C08
0PUR_C09 0PUR_O02, 80PUR_O01, 2LIS_02_HDR
0SRV_C01 2LIS_02_S174
0PUR_C04 2LIS_02_S011, 2LIS_02_SCL, 2LIS_02_ITM, 2LIS_02_HDR
0PUR_C01 2LIS_02_S012, 2LIS_02_ITM, 2LIS_02_SCL
0PUR_C02 2LIS_02_S013
0PUR_C05 0MM_PUR_VE_01
0PUR_C06 0MM_PUR_VE_02
0PUR_C03 2LIS_02_S015
Inventory Management
CUBE
0IC_C03 2LIS_03_BX, 2LIS_03_BF, 2LIS_03_UM
0IC_C02 2LIS_03_S195, 2LIS_03_S197
0IC_C01 2LIS_03_S196, 2LIS_03_S198
SD
Data sources Tables
2LIS_11_VAKON VBUK, VBUP, VBAK, VBAP, VBKD, KOMV, T001.
2LIS_11_VAHDR VBAK, VBUK, T001
2LIS_11_VAITM VBAP, VBUP, VBAK, VBKD, VBAJP, T001, VBUK, PRPS.
2LIS_11_VASCL VBAP, VBUP, VBAK, VBEP, VBKD, T001, PRPS
2LIS_11_VASTH VBUK
2LIS_11_VASTI VBUP, VBUK
2LIS_11_V_ITM VBAP, VBAK, VBKD, VBUP, T001, PRPS, VBUK.
2LIS_11_V_SCL VBUP, VBEP, VBKD, VBAP, VBAK, T001, PRPS.
2LIS_11_V_SSL VBAP, VBEP, LIPS, WVBEP, VBUP
2LIS_13_VDKON VBUK, VBRP, KOMV, T001, VBRK.
FI
DATASOURCES Tables
Cost Center Accounting
0CO_OM_CCA_1 COSP, COSS
0CO_OM_CCA_2 COSP, COSS
0CO_OM_CCA_3 COSL, COKL.
0CO_OM_CCA_4 COSR
0CO_OM_CCA_5 COSB
0CO_OM_CCA_6 COOMCO
0CO_OM_CCA_7 BPJA, BPPE.
0CO_OM_CCA_8 COST, TKA07,COOMCO, CSLA, COST, COKL.
0CO_OM_CCA_9 COVP (COEP& COBK), COSP, COST, COEJ, COEP, T001.
0CO_OM_CCA_10 COOI, COSP_VTYPE.
Product Cost Controlling
0CO_PC_PCP_01 KEKO, TCKH3, TCKH8.
0CO_PC_PCP_02 KEKO, TCKH3.
0CO_PC_PCP_03 CKIS, T001K, TKA02, KEKO, MARA, MBEW.
0CO_PC_PCP_04 CKIS, T001K, TKA02, KEKO, MBEW, MARA
0CO_PC_01 AUFK, AFPO, COSS, COSP, COSB, COSL, COKEY, TKA09, TKV09
0CO_PC_02 AUFK, AFPO, COSS, COSP, COSB, COSL, COKEY, TKA09, TKV09
Hope this Helps. -
Workbook as datasource and datatarget for APD
Hi Experts,
Can we have a workbook as the datasource and data target for an APD?
Or any alternative way to achieve this?
Regards,
Dinesh
Hi Anil,
Thanks... but I want a workbook as the input and output for the APD instead of a query.
Regards,
Dinesh -
Loading the cube from 3 datasources and getting 3 records for each keyfield
Hi All,
I am loading an InfoCube from 3 separated datasources. These 3 datasources are UD Datasources and their unique source system is UD Connect.
Each of the datasources contains a unique key field, 'Incident Number' (the same as we would use in datasources for a DSO).
The problem is, when I load data from these 3 datasources into the cube, there are 3 records for each 'Incident Number'.
We have reports on this InfoCube, and the reports also display 3 records for each incident number.
If I remove the Incident Number key field from 2 of the datasources, the data from these datasources does not reach the cube.
For many of you this may be a minor problem (or may not be a problem at all!!!), but as a new joiner in the SAP field, this has become a showstopper issue for me.
Please suggest.
Thanks in advance.
Hi Pravender,
Thanks for your interest.
The scenario is, I have 3 datasources from the same source system. All 3 datasources have different fields except 'Incident Number', so each field has only one value in the report. But due to the 3 separate datasources, it creates 3 records, displaying the values of each datasource in a separate record.
There is no field in the query output which has different values for the different source systems. Due to the 3 records in the cube, one record will contain the value for a particular field and the other two records will show a blank for that field.
Regards. -
Standerd BI Content DataSources and Cubes used for Cement industry
HI,
Can anyone kindly suggest the list of standard BI Content DataSources
and standard BI Content cubes used for cement industry
related projects:
standard BI Content datasources and cubes for the modules
SD, product costing, inventory, and GL.
Kindly suggest.
Regards,
Prasad
Hi,
You can use the normal InfoCubes/DataSources for the cement industry. I recently saw the design of a cement industry BI system, and they used the same InfoCubes and DataSources there as well.
For Cubes and DataSource details see in SDN..
https://wiki.sdn.sap.com/wiki/display/BI/BWSDMMFIDATASOURCES
Thanks
Reddy -
Business content empty in BW 7.3 for datasources and infosources
Hi,
We recently installed a BW 7.3 system. I went to Business Content to install a datasource, and it is empty.
I double-clicked RSDS and TRCS "select object", and they did not show the list of datasources and infosources.
I checked the system status and this is the information
soft comp - SAP BW
Release - 730
Level - 000
highest support - empty
short description - SAP Buiness warehouse
thanks
Bhat
Hi,
We were missing BI Content in the system. Basis installed BI Content and it resolved the issue.