BFOK_A R/3 Table - Data Source and Standard Business Content
Hi All,
For the R/3 table BFOK_A (FI subsequent BA/PC adjustment: vendor items), please let me know which DataSource should be used.
Also, please let me know the standard business content related to this table.
Thanks,
Srinivas.
Hi All,
Please let me know the business content extractor for the BFOK_A Table.
Thanks
Srinivas.
Similar Messages
-
FI/CO module flow and standard business content InfoCubes , DataSources
Hi
I'm new to BW. Can anyone tell me:
1) The flow of the FI/CO module for BW.
2) What are the standard InfoCubes, ODS objects, and DataSources for the FI/CO module?
Please let me know. I will assign the points.
kumar
Hi,
Refer
http://help.sap.com/bp_biv235/BI_EN/html/BW/FinancialAccounting.htm
http://help.sap.com/bp_biv235/BI_EN/html/bw.htm
http://help.sap.com/saphelp_nw04/helpdata/en/65/7beb3cad744026e10000000a11405a/frameset.htm
FI CO Datasource/Cube/Reports..
http://help.sap.com/saphelp_nw04/helpdata/en/57/dd153c4eb5d82ce10000000a114084/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/8d/1f383fe58d5900e10000000a114084/frameset.htm
http://help.sap.com/saphelp_nw04s/helpdata/en/65/7beb3cad744026e10000000a11405a/frameset.htm
Thanks -
Table Data Source Search Result gives ClassCastException
I set up a table data source and queried it using the following URL:
http://machine_name:port/ultrasearch/query/search.jsp?usearch.p_mode=Advanced
and specified my table data source. The result URLs
came up with the right primary key id. However when I
click the URL, I get:
java.lang.ClassCastException: com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].sql.OrclCallableStatement
at oracle.ultrasearch.query.DisplayUtil.displayTableUrl(DisplayUtil.java:131)
at display.jspService(_display.java:1568) [SRC:/display.jsp:81]
at com.orionserver[Oracle9iAS (9.0.2.0.0) Containers for J2EE].http.OrionHttpJspPage.service(OrionHttpJspPage.java:56)
at oracle.jsp.runtimev2.JspPageTable.service(JspPageTable.java:302)
at oracle.jsp.runtimev2.JspServlet.internalService(JspServlet.java:407)
at oracle.jsp.runtimev2.JspServlet.service(JspServlet.java:330)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:336)
at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.ResourceFilterChain.doFilter(ResourceFilterChain.java:59)
at oracle.security.jazn.oc4j.JAZNFilter.doFilter(JAZNFilter.java:283)
at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:523)
at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:269)
at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:735)
at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].server.http.AJPRequestHandler.run(AJPRequestHandler.java:151)
at com.evermind[Oracle9iAS (9.0.2.0.0) Containers for J2EE].util.ThreadPoolThread.run(ThreadPoolThread.java:64)
I have specified NUMBER as the data type for my primary key column and it is of type NUMBER in my table DDL. Is that OK or could this be causing the problem?
Dinesh
Dinesh,
Can you provide the following information for creating the table data source:
- Is the table data source based on a table or a view?
- Is the table/view in the local or remote database?
- If the table is in the local database, is the table in the instance owner schema or another schema?
- Do you login to Ultra Search Admin Tool as the instance owner or other users?
- Does the instance owner schema have privileges to read the contents in the table/view? -
Standard business content not extracting all data
Hello,
I am currently rolling out QN data. The standard business content for 2lis_05Q0task includes the FECOD, FEGRP & OTEIL fields (from table QMFE); however, this data is not being pulled. All the other data is coming in OK. Is there something I need to do to get this data?
Thanks for your help
Mireille
Hi,
Please check in the QMFE table whether data is present for these fields.
Take the notification number from the RSA3 extraction and
look for it in the QMFE table.
Rgds,
Ravindra. -
Help to read a table with data source and convert time stamp
Hi Gurus,
I have a requirement and need to write an ABAP program. When I execute the program, it should prompt me to enter a data source name; the program should then read a table using this data source as the key, sort the time stamps from the table, and display the data source and time stamp as output.
As follows:
Enter Data Source Name:
Then user enters : 2lis_11_vahdr
Then the output should be: "Data source:" 10-15-2008.
The time stamp format in the table is 20050126031520 (YYYYMMDDhhmmss). I have to display it as 05-26-2005. Any help would be appreciated.
Thanks,
Ram
Hi Jayanthi Babu Peruri,
I tried to extract YEAR, MONTH, and DAY separately and
wrote them using an EDIT MASK.
There is definitely some STANDARD CONVERSION ROUTINE for this, but I have no idea which one.
DATA : V_TS   TYPE TIMESTAMP,
       V_TS_T TYPE CHAR16,
       V_YYYY TYPE CHAR04,
       V_MM   TYPE CHAR02,
       V_DD   TYPE CHAR02.

START-OF-SELECTION.
  GET TIME STAMP FIELD V_TS.
  V_TS_T = V_TS.
  CONDENSE V_TS_T.
  V_YYYY = V_TS_T(4).       " YYYY
  V_MM   = V_TS_T+4(2).     " MM
  V_DD   = V_TS_T+6(2).     " DD
  V_TS_T(2)   = V_MM.       " rearrange to MMDDYYYYhhmmss
  V_TS_T+2(2) = V_DD.
  V_TS_T+4(4) = V_YYYY.
  SKIP 10.
  WRITE : /10 V_TS,
          /10 V_YYYY,
          /10 V_MM,
          /10 V_DD,
          /10 V_TS_T USING EDIT MASK '__-__-__________'.
If you want DATE alone, just declare the length of V_TS_T as 10.
Regards,
R.Nagarajan.
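For comparison, the same extract-and-reorder logic can be sketched in Python (illustration only; the function name here is made up, and in ABAP a standard route would be the CONVERT TIME STAMP statement or a conversion exit):

```python
# Minimal sketch: turn a 14-digit YYYYMMDDhhmmss stamp into MM-DD-YYYY.
from datetime import datetime

def format_timestamp(ts: str) -> str:
    """Parse YYYYMMDDhhmmss and return the date part as MM-DD-YYYY."""
    return datetime.strptime(ts, "%Y%m%d%H%M%S").strftime("%m-%d-%Y")

print(format_timestamp("20050126031520"))  # -> 01-26-2005
```

Note that 20050126 is January 26, so the formatted date comes out as 01-26-2005.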
-
Create a data source and database tables using WSAD
Hi, guys:
the following is from a tutorial:
http://www-106.ibm.com/developerworks/websphere/techjournal/0306_wosnick/wosnick.html
"To create the data source and Cloudscape database tables automatically, right click on the HelloWorldServer in the Servers view, and select the Create tables and data sources menu item. A dialog will then display showing that the data source and database tables were created successfully (Figure 5)."
I am using the WSAD 5.0 trial version. I cannot find the Create tables and data sources menu item when I right-click on HelloWorldServer in the Servers view. I am wondering whether this is because the trial version does not have this feature.
regards
This question is a little off topic, but you may get a reply. Please note this forum is about Sun's J2EE SDK and its related technologies. You may have better luck posting your question to an IBM-specific resource.
-
Standard business content Data source
Hi Experts,
Could you please let me know which standard business content DataSource would be suitable for the business attributes below?
Stock Transfer Order (STO) Number
STO Creation Date
Delivery Creation Date
STO required Date
Thanks,
Gana
Hi All,
Please let me know the business content extractor for the BFOK_A Table.
Thanks
Srinivas. -
Create a Generic Data Source and deploy a webservice on a domain using WLST
Hello everyone,
I usually create generic data sources and deploy web services on a WLS domain using the Administration Console. I wonder if you can pass me a sample script to do this task using WLST. Also, if you can point me to a "quick start" guide for WLST, that would be very helpful.
Additionally, will I be able to create groups, application roles, and users in a domain using WLST?
Thanks for your help.
You can use standard InfoObjects as well.
You can create a view on those R/3 tables and use this view to create a generic datasource in transaction RSO2.
Check the PDF below for delta handling in generic datasources:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33 -
Create two or more data sources and mapping to DSO
Hi,
I'm using SAP NetWeaver BI 7.0.
If there are two or three data sources which have to be mapped to a DSO, which field from the data sources has to be mapped to which field in the DSO?
Is it possible to have only one DSO, or should there be three DSOs because of the three data sources?
The thing is, I have created one view out of three tables, and for the view I have created one DSO. Everything worked fine. But now the requirements have changed because of currencies.
The view I have created is now mixing currencies, because the view has only one currency field, which is filled from the three tables. Two tables have varying currencies and one table has a single currency.
The currency of that one table is important and should stay as it is.
I'm thinking about something like three different currency key fields in the DSO, each mapped from its data source, along with mapping every data source field to its corresponding field in the DSO.
Any hints? I have found articles on SDN about creating data sources and so on, but they don't help me in this matter.
Thank you in advance!
Thank you guys for explaining! I'm new to SAP BW and trying to understand it better.
I'll also try it with the view. It sounds "easier".
But if I'm doing it with the 3 data sources, that means that for each currency field of a data source I will need a corresponding currency field in the DSO for mapping.
Is it possible to have the 0Currency field 3 times in the DSO, with each data source's currency field mapped to it?
If that would work, 0Currency would have to be contained in the key figures.
But if 0Currency is contained in each key figure, will the currency assignment work correctly?
For example:
Data Source1:
Turnover in Euro.
(All currency is in Euro)
Data Source2:
Profit in Dollar.
Profit in Euro.
Profit in Yen.
Profit in different currencies available.
Data Source3:
Sales in Dollar.
Sales in Euro.
Sales in Yen.
Sales in different currencies available.
Mapping from the DataSources to the DSO (if it is possible to have 0Currency 3 times in the DSO):
Data Source1 currency field ---> DSO 0Currency
Data Source2 currency field ---> DSO 0Currency
Data Source3 currency field ---> DSO 0Currency
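As a rough sketch of that idea outside SAP (hypothetical field names, not SAP transformation code): each source keeps its own currency key, and the load routes it into a source-specific currency field of the target record.

```python
# Toy sketch: route each source's currency key into its own target field.

def map_to_dso(source_id: int, record: dict) -> dict:
    """Map a source record into a target layout with per-source currency fields."""
    target = {"currency_1": None, "currency_2": None, "currency_3": None}
    # copy all non-currency fields through unchanged
    target.update({k: v for k, v in record.items() if k != "currency"})
    # the source's currency key lands in its own target field
    target[f"currency_{source_id}"] = record["currency"]
    return target

row = map_to_dso(2, {"profit": 100, "currency": "USD"})
print(row["currency_2"])  # -> USD : the Data Source2 currency, kept separate
```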
Does it make sense? -
I added 0AMOUNT in generic data source and in rsa3 i am seeing the data ..b
i added 0AMOUNT in generic data source and in rsa3 i am seeing the data ..but i am not seeing any data in target table..
what would be the causeHi,
I guess you mean the target table in BW, correct?
First, replicate your DataSource in BW.
Open your transfer rules. In the tab Transfer structure/DataSource, locate your field in the right pane (it should be greyed out, not blue); move it to the left (to the transfer structure); then reactivate and reload.
You should now see the field in your PSA table.
hope this helps...
Olivier. -
Dear BI Experts,
We are planning to implement payment details like check number, date, and bank name.
Our requirement matches the PAYR table.
But we are not finding a related data source in R/3.
Please suggest a data source for the PAYR table.
Regards
Ramu
Hi Ramu,
This may help
(0FI_AP_51)
http://help.sap.com/saphelp_nw70/helpdata/EN/db/0345943f9043e083c3ccba1e73b42a/frameset.htm
Regards, -
The error below makes absolutely no sense! I'm using Enterprise Core, yet I'm being told I can't use remote data sources:
w3wp!library!8!03/05/2015-19:08:48:: i INFO: Catalog SQL Server Edition = EnterpriseCore
w3wp!library!8!03/05/2015-19:08:48:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.OperationNotSupportedException: , Microsoft.ReportingServices.Diagnostics.Utilities.OperationNotSupportedException: The feature: "The edition of Reporting
Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database." is not supported in this edition of Reporting Services.;
Really? This totally contradicts the documentation found here:
https://msdn.microsoft.com/en-us/library/ms157285(v=sql.110).aspx
That article says remote connections are completely supported.
ARGH! Why does this have to be so difficult to set up?!?
Hi jeffoliver1000,
According to your description, you are using Enterprise Core edition and you are prompted that you can’t use remote data sources.
In your scenario, we don't doubt what you say, but we have actually seen cases before where the SQL Server engine is Enterprise yet Reporting Services is still Standard. So I would recommend that you check the actual edition of Reporting Services you are using. You can find the Reporting Services starting SKU in the Reporting Services logs (default location: C:\Program Files\Microsoft SQL Server\<instance name>\Reporting Services\LogFiles). For more information,
please refer to the similar thread below:
https://social.technet.microsoft.com/Forums/en-US/f98c2f3e-1a30-4993-ab41-acbc5014f92e/data-driven-subscription-button-not-displayed?forum=sqlreportingservices
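A quick way to scan that log directory for the edition line, sketched in Python. The directory path (including the instance name) and the "SKU" marker text are assumptions; adjust both to your installation.

```python
# Sketch: scan Reporting Services log files for lines that mention the
# starting SKU/edition. Path and marker are assumptions; adjust as needed.
from pathlib import Path

LOG_DIR = Path(r"C:\Program Files\Microsoft SQL Server\MSRS11.MSSQLSERVER"
               r"\Reporting Services\LogFiles")  # hypothetical instance name

def find_sku_lines(log_dir: Path) -> list:
    """Return every log line that mentions 'SKU', oldest file first."""
    hits = []
    for log in sorted(log_dir.glob("*.log")):
        for line in log.read_text(errors="ignore").splitlines():
            if "SKU" in line:
                hits.append(line.strip())
    return hits

if __name__ == "__main__":
    for hit in find_sku_lines(LOG_DIR):
        print(hit)
```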
By the way, have you installed the other SQL Server edition before?
Best regards,
Qiuyun Yu
Qiuyun Yu
TechNet Community Support -
Generic Data Source and InfoPackage Data Selection
Hello, I'm trying to extract data from a generic data source (composed of a view joining 4 tables) using the data selection criteria in an InfoPackage. Sometimes the selection criteria are ignored and more data is pulled than expected. The number of selectable items in the generic data source was reduced (in case this was an issue), and for a short time the InfoPackage pulled the expected number of records. Then another change was made to the generic data source and the problem returned. In any case, the transported generic data source does not work in the target system either. I'm not sure what is causing this to happen. Any ideas??? BW 3.5.
Regards,
Sue
Message was edited by: Sue and Enrique
Ramesh P
Hi,
Some datasources will not have default InfoPackages. You can create the InfoPackages according to the requirement; there is no big deal in creating InfoPackages, so you can go ahead and create them.
Regards
Karthik -
Are both Data Source and Data Store the same in BI? If not, can someone explain what each of these terms means?
Thanks for the help
A DataSource's Persistent Staging Area (PSA) is a transparent database table, the initial store in BI. In this table the requested data is saved unchanged from the source system.
DataStore Objects are primary physical database storage objects used in BI. They are designed to store very detailed transactional level records.
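Loosely, the difference can be pictured with a toy model (illustration only, not SAP code): the PSA-like store appends every loaded record unchanged, while a standard DSO-like store keeps one current record per document key, overwriting on repeated loads.

```python
# Toy model: staging keeps everything as delivered; the keyed store keeps
# only the latest version of each record.

psa = []    # staging: every request kept exactly as delivered
dso = {}    # one detailed record per key; latest version wins

def load(records):
    for rec in records:
        psa.append(rec)              # staged unchanged
        dso[rec["doc_no"]] = rec     # overwritten on reload

load([{"doc_no": "4711", "amount": 100}])
load([{"doc_no": "4711", "amount": 120}])  # correction run

print(len(psa))               # -> 2 : both loads survive in staging
print(dso["4711"]["amount"])  # -> 120 : only the latest in the keyed store
```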
Thanks -
GoldenGate Extracts read slow during Table Data Archiving and Index Rebuilding Operations.
We have configured OGG on a near-DR server. The extracts are configured to work in ALO Mode.
During the day, extracts work as expected and are in sync. But during any daily maintenance task, the extracts start lagging and read the same archives very slowly.
This usually happens during Table Data Archiving (DELETE from prod tables, INSERT into history tables) and during Index Rebuilding on those tables.
Points to be noted:
1) The Tables on which Archiving is done and whose Indexes are rebuilt are not captured by GoldenGate Extract.
2) The extracts are configured to capture DML operations. Only INSERT and UPDATE operations are captured; DELETEs are ignored by the extracts. Also, DDL extraction is not configured.
3) There is no connection to PROD or DR Database
4) The system functions normally all the time, but during table data archiving and index rebuilds it starts lagging.
Q 1. As mentioned above, even though the tables are not part of the capture, the extract lags. What are the possible reasons for the lag?
Q 2. I understand that an index rebuild is a DDL operation, yet it still induces a lag into the system. How?
Q 3. We have been trying to find a way to overcome the lag, which ideally shouldn't have arisen. Is there any extract parameter or some workaround for this situation?
Hi Nick.W,
The amount of redo logs generated is huge. Approximately 200-250 GB in 45-60 minutes.
I agree that the extract has to parse the extra object IDs. During the day, there is a redo switch every 2-3 minutes. The source is a 3-node RAC, so approximately 80-90 archives are generated in an hour.
The reason to mention this was that while reading these archives too, the extract would be parsing extra object IDs, as we are capturing data for only 3 tables. The effect of parsing extra object IDs should have been seen during the day as well, since the archive size is the same, the amount of data is the same, and the number of records to be scanned is the same.
The extract slows down and reads at half the speed. If it would normally take 45-50 seconds to read an archive log during normal daily functioning, it takes approx. 90-100 seconds to read the archives during the mentioned activities.
Regarding the 3rd point,
a. The extract is a classic extract, and the archived logs are on a local file system. No ASM, no SAN/NAS.
b. We have added the "TRANLOGOPTIONS BUFSIZE" parameter to our extract. We'll update as soon as we see any kind of improvement.