Data load / Collection of R/3 table fields to BW
Hi,
Here is the situation: I have to pull data from SAP R/3 tables such as VBRP, VBAK, and VBKD into BW on a monthly basis. Can any of you gurus tell me what I have to do to load this data from R/3 into a cube in BW? Could you please give a detailed step-by-step procedure covering both the basic steps on the R/3 side and the steps in BW?
Thanks,
David
Hi David,
You could do this in two ways:
Either use the SAP-delivered standard DataSources, e.g. 2LIS_13_VDITM and 2LIS_11_VAHDR, or create a generic extractor using a view or a function module.
If you decide to use the SAP-delivered standard extractors, you can find them in RSA5. Activate them and they are ready to use. You can test them in the RSA3 extractor checker.
If you decide to use a generic extractor, go to RSO2 and create a DataSource of your own with the necessary fields.
Once this is done on the R/3 side, you can check the data in the RSA3 extractor checker.
Now, on the BW side, replicate the DataSource in your source system.
Create the transfer structure and communication structure, then the cube. Connect the cube to the transfer structure with update rules; the mappings should be clear.
Create an InfoPackage and schedule the job.
Hope this helps you,
Bye,
Naga.
Similar Messages
-
Very slow performance in every area after massive data load
Hi,
I'm new to Siebel. I had a call from a customer saying that virtually every aspect of the application (login, etc.) is slow after they did a massive data load of around 15 GB.
Could you please point out best practices for this kind of massive data-loading exercise? All the table statistics are up to date.
Has anyone encountered this kind of problem before?
Hello,
Siebel CRM is a highly customizable customer relationship management solution. There are number of customizations (scripting, workflow, web services,...) and integrations (custom c++, java, ERP system,...) that can cause Siebel performance issues.
Germain Monitoring v1.8.5 can help you clean up all your Siebel performance issues (it is usable 5 minutes after installation, which can take between 4 hours and 10 days depending on whether it targets your Siebel dev/QA or production environment) and then monitor your Siebel production system 24x7, at every layer of your infrastructure and at the level of Siebel user clicks and back-end transactions, either solving or identifying the root cause of the performance issues.
Germain Monitoring Software (currently version 1.8.5) helps Siebel customers 1) solve Siebel performance issues introduced by customizations faster, and 2) effectively solve Siebel performance issues before the business is impacted once Siebel is in production.
Customers like NetApp, J.M. Smucker, and Alltel/Verizon have saved hundreds of thousands of dollars using Germain Monitoring software.
Let us know whether you would like to discuss this further...good luck w/ these issues,
Regards,
Yannick Germain
GERMAIN SOFTWARE LLC
Siebel Performance Software
21 Columbus Avenue, Suite 221
San Francisco, CA 94111, USA
Cell: +1-415-606-3420
Fax: +1-415-651-9683
[email protected]
http://www.germainsoftware.com -
Issue:
I have SAP BW system and SAP HANA System
SAP BW to SAP HANA connecting through a DB Connection (named HANA)
Whenever I create an Open Hub destination of type DB Table using the DB Connection, the table is created at the HANA schema level (L_F50800_D).
I executed the Open Hub service without checking the 'Deleting Data from Table' option.
16 records were loaded from BW to HANA, matching the source.
The second time I executed it from BW to HANA, 32 records arrived (it appends).
Then I executed the Open Hub service with the 'Deleting Data from Table' option checked.
Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
When going from one SAP BW system to another SAP BW system, it works fine.
Is this option supported through a DB Connection or not?
Please see the attachment along with this discussion and help me resolve this.
From
Santhosh Kumar
Hi Ramanjaneyulu,
First of all thanks for the reply ,
Here the issue is at the OH level (definition level: DESTINATION tab and FIELD DEFINITION).
There is a check box there which I have already selected; that is exactly my issue: even though it is selected,
the deletion is not performed at the target level.
SAP BW to SAP HANA via DB connection:
1. First execution from BW: suppose 16 records - DTP executed - loaded to HANA - 16 records, same.
2. Second execution from BW: now the HANA side has appended, i.e. 16 + 16 = 32.
3. So I selected the check box at the OH level, 'Deleting Data from Table'.
4. Now when I execute the DTP it throws a short dump: DBIF_RSQL_TABLE_KNOWN.
Now please tell me how to resolve this. Is the 'Deleting Data from Table' option applicable for HANA at all?
Thanks
Santhosh Kumar -
Is it possible to upload only a few columns of a table through APEX Data Loading
Hi All,
I have to do an upload into the table through a CSV file. The table's primary key I have to fill from the backend; the rest I have to load from the user's uploaded file. Is it possible to do the data loading to the table only for the required columns and fill the other columns from the backend? Or is there any other way to do this?
Hi,
Your query is not really clear.
>
Is it possible to do the data loading to the table only to required columns and fill the other columns from backend. Or is there any other way to do this?
>
How do you plan to "link" the rows from these 2 sets of data in the "backend"? There has to be a way to have a relation between them.
Regards, -
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from .csv file into oracle table.
Is there any mechanism so that i can follow the same?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post ... already there :)
Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM -
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from a .csv file into an existing/new table, and also checked out Vikas's application regarding the same. I'll try to explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I need to know is: before the CSV data is loaded into the timesheet table, is there any way of validating the project_key (which is the primary key of the projects table) against the projects table? I need to perform similar validations on other columns, such as customer_id against the customers table. Basically, the load should happen only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same?
Does Vikas's application do what the utility does? (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
You can create hidden items in the page to validate previous records before inserting the data.
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
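As a rough illustration of the pre-validation idea discussed above, here is a minimal Python sketch that checks each CSV row's project_key against a set of valid parent keys before it is considered loadable. The file contents, column names, and the way the valid keys are obtained are all assumptions for illustration; in APEX/Oracle you would typically do this with a lookup query or a foreign-key constraint on a staging table instead.

```python
import csv
import io

def validate_rows(csv_text, valid_project_keys):
    """Split CSV rows into loadable rows and rejects, based on a parent-key check.

    csv_text: CSV content whose header row contains 'project_key'.
    valid_project_keys: keys that exist in the parent projects table (assumption).
    """
    good, bad = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["project_key"] in valid_project_keys:
            good.append(row)   # parent row exists: safe to load
        else:
            bad.append(row)    # orphan row: reject before loading
    return good, bad

# Hypothetical sample data: two rows, one with an unknown project_key.
sample = "project,project_key,hours_worked\nAlpha,P1,8\nBeta,P9,6\n"
good, bad = validate_rows(sample, {"P1", "P2"})
```

The same split could drive two INSERTs: the good rows into the timesheet table, the rejects into an error table for review.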
How to load the data from .csv file to oracle table???
Hi,
I am using Oracle 10g and PL/SQL Developer. Can anyone help me with how to load the data from a .csv file into an Oracle table? The table is already created with the required columns. The .csv file has about 10 lakh (1 million) records. Is it possible to load 10 lakh records? Can anyone please tell me how to proceed?
Thanks in advance
981145 wrote:
Can you tell me more about SQL*Loader? How do I know whether that utility is available for me or not? I am using an Oracle 10g database and PL/SQL Developer.
SQL*Loader is part of the Oracle client. If you have a developer installation you should normally have it on your client.
The command is
sqlldr
Type it and see if you have it installed.
Have a look also at the FAQ link posted by Marwin.
There are plenty of examples also on the web.
Regards.
Al -
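For what it's worth, a minimal SQL*Loader setup for this kind of CSV-to-table load might look like the control file below. The file name, table name, and column names are placeholders, not taken from the thread; a million rows is well within SQL*Loader's normal range, and conventional-path loading is usually fine at that size.

```
-- load_emp.ctl (hypothetical control file)
LOAD DATA
INFILE 'emp.csv'
APPEND INTO TABLE emp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(empno, ename, sal)
```

It would be invoked from the command line as something like: sqlldr scott/tiger CONTROL=load_emp.ctl LOG=load_emp.log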
While loading transaction data into a cube, what tables are generated
Hi,
While loading transaction data into a cube, what tables are normally generated?
Hi,
Normally the data is loaded into the 'F' fact table (/BIC/F****, where **** is the cube name).
When you compress the request, the data is moved into the E table.
Regards,
Siva. -
Data Load from XML file to Oracle Table
Hi,
I am trying to load data from an XML file into an Oracle table using the DBMS_XMLStore utility. I have performed the prerequisites: created the directory from the APPS user, granted read/write on the directory, placed the data file in a folder on the apps tier, and created a procedure 'insertXML' to load the data based on Metalink note 396573.1 (How to Insert XML by Passing a File Instead of Using Embedded XML). I am running the procedure through the anonymous block below to insert the data into the table.
Anonymous block
declare
begin
insertXML('XMLDIR', 'results.xml', 'employee_results');
end;
I am getting below error after running the anonymous block.
Error: ORA-22288: file or LOB operation FILEOPEN failed
Cause : The operation attempted on the file or LOB failed.
Action: See the next error message in the error stack for more detailed
information. Also, verify that the file or LOB exists and that
the necessary privileges are set for the specified operation. If
the error still persists, report the error to the DBA.
I searched for this error on Metalink and found Doc ID 1556652.1. I ran the script provided in the document. PFA the script.
Also, attaching a document that list down the steps that I have followed.
Please check and let me know if I am missing something in the process. Please help to get this resolved.
Regards,
Sankalp
Thanks Bashar for your prompt response.
I ran the INSERT statement but encountered an error; below are the error details.
Error report -
SQL Error: ORA-22288: file or LOB operation FILEOPEN failed
No such file or directory
ORA-06512: at "SYS.XMLTYPE", line 296
ORA-06512: at line 1
22288. 00000 - "file or LOB operation %s failed\n%s"
*Cause: The operation attempted on the file or LOB failed.
*Action: See the next error message in the error stack for more detailed
information. Also, verify that the file or LOB exists and that
the necessary privileges are set for the specified operation. If
the error still persists, report the error to the DBA.
INSERT statement I ran
INSERT INTO employee_results (USERNAME, FIRSTNAME, LASTNAME, STATUS)
SELECT *
FROM XMLTABLE('/Results/Users/User'
       PASSING XMLTYPE(BFILENAME('XMLDIR', 'results.xml'),
                       NLS_CHARSET_ID('CHAR_CS'))
       COLUMNS USERNAME  NUMBER(4)     PATH 'USERNAME',
               FIRSTNAME VARCHAR2(10)  PATH 'FIRSTNAME',
               LASTNAME  NUMBER(7,2)   PATH 'LASTNAME',
               STATUS    VARCHAR2(14)  PATH 'STATUS');
Regards,
Sankalp -
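Since the ORA-22288 "No such file or directory" error points at the file rather than the SQL, it can help to sanity-check the XML structure and element names outside the database first. Here is a small Python sketch that parses a results.xml with the layout implied by the XPath above (/Results/Users/User); the element names and sample values are assumptions for illustration, not the actual file from the thread.

```python
import xml.etree.ElementTree as ET

# Hypothetical results.xml matching the XPath /Results/Users/User.
xml_text = """
<Results>
  <Users>
    <User>
      <USERNAME>1001</USERNAME>
      <FIRSTNAME>Jane</FIRSTNAME>
      <LASTNAME>Doe</LASTNAME>
      <STATUS>ACTIVE</STATUS>
    </User>
  </Users>
</Results>
"""

root = ET.fromstring(xml_text)
# One dict per <User>, keyed by child element name, mirroring the COLUMNS clause.
rows = [
    {child.tag: child.text for child in user}
    for user in root.findall("./Users/User")
]
```

If this parse succeeds but the database still raises ORA-22288, the problem is almost certainly the directory path or file permissions on the server, not the XML itself.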
Comparison of Data Loading techniques - Sql Loader & External Tables
Below are 2 techniques using which the data can be loaded from Flat files to oracle tables.
1) SQL Loader:
a. Place the flat file( .txt or .csv) on the desired Location.
b. Create a control file, e.g.:
LOAD DATA
INFILE 'mytextfile.txt'  -- file containing the table data; specify the path correctly; it could be a .csv as well
APPEND                   -- or TRUNCATE, based on the requirement
INTO TABLE oracle_tablename
FIELDS TERMINATED BY ','  -- or whatever delimiter the input file uses
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
c. Now run Oracle's sqlldr utility at the command prompt:
sqlldr username/password CONTROL=filename.ctl
d. The data can be verified by selecting it from the table:
SELECT * FROM oracle_table;
2) External Table:
a. Place the flat file (.txt or .csv) on the desired location.
abc.csv
1,one,first
2,two,second
3,three,third
4,four,fourth
b. Create a directory
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c. After granting appropriate permissions to the user, we can create external table like below.
create table ext_table_csv (
i Number,
n Varchar2(20),
m Varchar2(20)
)
organization external (
type oracle_loader
default directory ext_dir
access parameters (
records delimited by newline
fields terminated by ','
missing field values are null
)
location ('abc.csv')
)
reject limit unlimited;
d. Verify data by selecting it from the external table now
select * from ext_table_csv;
External tables feature is a complement to existing SQL*Loader functionality.
It allows you to –
• Access data in external sources as if it were in a table in the database.
• Merge a flat file with an existing table in one statement.
• Sort a flat file on the way into a table you want compressed nicely.
• Do a parallel direct path load without splitting up the input file.
Shortcomings:
• External tables are read-only.
• No data manipulation language (DML) operations or index creation is allowed on an external table.
Using SQL*Loader you can:
• Load data from within a stored procedure or trigger (via an INSERT statement, which is not possible with sqlldr itself)
• Do multi-table inserts
• Flow the data through a pipelined PL/SQL function for cleansing/transformation
Comparison for data loading:
To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
So, when you create the external table, the database divides the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have to manually divide your input file into multiple smaller files.
Conclusion:
SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle tables using DB links.
Please let me know your views on this.
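To illustrate the manual file-splitting that parallel SQL*Loader requires (in contrast to the automatic parallelism of external tables described above), here is a small Python sketch that divides a CSV's data rows into N nearly equal chunks, one per parallel sqlldr session. The chunking scheme and sample rows are illustrative assumptions, not part of SQL*Loader itself.

```python
def split_rows(rows, n_chunks):
    """Divide data rows into n_chunks nearly equal slices,
    one slice per parallel SQL*Loader session."""
    size, rem = divmod(len(rows), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        # The first `rem` chunks absorb one extra row each.
        end = start + size + (1 if i < rem else 0)
        chunks.append(rows[start:end])
        start = end
    return chunks

rows = [f"{i},name{i}" for i in range(10)]  # 10 hypothetical CSV data rows
chunks = split_rows(rows, 4)                 # one input file per chunk
```

Each chunk would then be written to its own file and fed to a separate sqlldr process, which is exactly the manual work the external-table approach avoids.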
-
Load data from flat file to target table with scheduling in loader
Hi All,
I have requirement as follows.
I need to load the data into the target table every Saturday. My source file consists of data for several states. Each week I have to load one particular state's data into the target table.
If in the first week I loaded AP data, then in the second week, on Saturday, Karnataka, and so on.
Can you please also provide code showing how I can schedule the data load for every Saturday with a different state column value automatically?
Thanks & Regards,
Sekhar
The best solution would be:
get the flat file to the Database server
define an External Table pointing to that flat file
insert into the destination table(s) with INSERT ... SELECT * FROM external_table
Loading only a single state's data each Saturday might mean trouble, but assuming there are valid reasons to do so, you could:
create a job with a p_state parameter executing INSERT ... SELECT * FROM external_table WHERE state = p_state
create a Scheduler chain where each member runs on the next Saturday, executing the same job with a different p_state parameter
Managing Tables
Oracle Scheduler Concepts
Regards
Etbin -
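As a rough sketch of the weekly rotation logic (outside the database; Oracle Scheduler would normally handle the calendar itself), here is a Python snippet that, given a start date and an ordered list of states, picks the state due on a given Saturday. The state list and start date are hypothetical.

```python
from datetime import date, timedelta

STATES = ["AP", "Karnataka", "TamilNadu"]  # hypothetical rotation order
START = date(2024, 1, 6)  # a Saturday; the first week loads STATES[0]

def next_saturday(d):
    """Return the first Saturday on or after date d (Monday=0 ... Saturday=5)."""
    return d + timedelta(days=(5 - d.weekday()) % 7)

def state_for(saturday):
    """Pick the state due on a given Saturday, cycling through STATES weekly."""
    weeks = (saturday - START).days // 7
    return STATES[weeks % len(STATES)]

sat = next_saturday(date(2024, 1, 10))  # a Wednesday; resolves to the next Saturday
```

In the chain approach above, this mapping from Saturday to state is what each chain step's p_state parameter would encode.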
Hi All,
I am facing an issue with APEX 4.2.4 using the Data Load Table concept: in the lookup I used the
Where Clause option, but the where clause does not seem to be working. Please help me with this.
Hi all,
It looks like this where clause does not filter out the 'N' data. Please help me solve this.
SQL*Loader not loading data from csv format into Oracle table.
Hi,
I am trying to load data from flat files into an Oracle table; it is not loading the decimal values. Only character-type data is loaded, and the number columns come out null.
CONTROL FILE:
LOAD DATA
INFILE cost.csv
BADFILE consolidate.bad
DISCARDFILE Sybase_inventory.dis
INSERT
INTO TABLE FIT_UNIX_NT_SERVER_COSTS
FIELDS TERMINATED BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
HOST_NM,
SERVICE_9071_DOLLAR FLOAT,
SERVICE_9310_DOLLAR FLOAT,
SERVICE_9700_DOLLAR FLOAT,
SERVICE_9701_DOLLAR FLOAT,
SERVICE_9710_DOLLAR FLOAT,
SERVICE_9711_DOLLAR FLOAT,
SERVICE_9712_DOLLAR FLOAT,
SERVICE_9713_DOLLAR FLOAT,
SERVICE_9720_DOLLAR FLOAT,
SERVICE_9721_DOLLAR FLOAT,
SERVICE_9730_DOLLAR FLOAT,
SERVICE_9731_DOLLAR FLOAT,
SERVICE_9750_DOLLAR FLOAT,
SERVICE_9751_DOLLAR FLOAT,
GRAND_TOTAL FLOAT
)
In the table, the FLOAT columns are defined as NUMBER(20,20).
SAmple i/p from csv:
ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
Only ABOS12 is loaded; the rest of the numbers are not loaded.
Thanks.
Hi,
Thanks for your reply.
It's throwing an error.
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
Insert option in effect for this table: INSERT
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
HOST_NM FIRST 255 , O(") CHARACTER
SERVICE_9071_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9310_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9700_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9701_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9710_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9711_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9712_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9713_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9720_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9721_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9730_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9731_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9750_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9751_DOLLAR NEXT 255 , O(") CHARACTER
GRAND_TOTAL NEXT 255 , O(") CHARACTER
value used for ROWS parameter changed from 64 to 62
Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 2: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 3: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 4: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 5: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 6: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 7: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 8: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 9: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 10: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 11: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Please help me on it.
Thanks, -
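Two things in this thread are worth flagging, with the caveat that they are inferences from the log rather than confirmed causes. First, NUMBER(20,20) cannot hold values of 1 or more (precision 20 with scale 20 leaves no integer digits), so the table columns likely need something like NUMBER(20,2). Second, every rejection is on GRAND_TOTAL, the last field of each line, which is the classic symptom of a Windows-style trailing carriage return making a value like 4693.07 followed by CHR(13) an invalid number. A control-file entry along these lines would strip it:

```
-- Hypothetical fix: strip a trailing carriage return from the last field
GRAND_TOTAL "TO_NUMBER(REPLACE(:GRAND_TOTAL, CHR(13)))"
```

Converting the input file's line endings (e.g. with dos2unix) before loading would achieve the same thing without touching the control file.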
Where Clause in Table Lookups for Data Load
Hello,
In Shared Components I created a Data Load Table. To this Data Load Table I added a Table Lookup. On the page to edit the Table Lookup there is a field called Where Clause. I tried to add a where clause to my Table Lookup in this field, but it seems to have no effect on the Data Load process.
Does someone know how to use this Where Clause field?
Thanks,
Seb
Hi,
I'm having the same problem with the where clause being ignored in the table lookup. Is this a bug, and if so, is there a workaround?
Thanks in advance -
How to update existing table using Data Load from spreadsheet option?
Hi there,
I need to update an existing table, but in the Data Load application, when you select a CSV file to upload, it inserts all the data, replacing the existing data. How can I change this?
Let me know,
Thank you.
A.B.A.
And how do you expect your database server to access a local file on your machine?
Is the file accessible from outside your machine, say inside a web server folder, so that some DB process can poll the file?
Or is your DB server on the same machine where you have the text file?
You will have to figure out the file access part before automating user interaction or even auto-refreshing.