Bulk Data
Is there anywhere we can get bulk data for practising? The data should represent some sort of schema object relationship. Bulk means the number of rows should be in crores (tens of millions), for working closely on explain plans with bulk data.
Random data will only get you so far. It's fine for some types of bulk testing but it has two major flaws:
1) A lot of performance issues derive from skew in our data. Randomly generated values, while exhibiting clumps, are unlikely to have the extremes of data distribution which we see in real data. This includes things like variation in string length.
2) Generating keys is difficult. Sure, we can generate unique numeric keys with ROWNUM, but other types of uniqueness are harder, and wrangling foreign key relationships is a complete haemorrhoid.
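The point about skew can be made concrete. A small sketch (Python; the category names and weights are invented purely for illustration) of generating deliberately skewed values instead of uniform ones:

```python
import random

def skewed_values(n, seed=42):
    """Generate n category values with deliberate skew: one dominant value
    plus rare outliers -- the shape that uniform random generation rarely
    gives you. Categories and weights here are made up."""
    rng = random.Random(seed)
    return rng.choices(["ACTIVE", "CLOSED", "SUSPENDED"],
                       weights=[90, 9, 1], k=n)

rows = skewed_values(100_000)
counts = {v: rows.count(v) for v in ("ACTIVE", "CLOSED", "SUSPENDED")}
```

A column populated this way gives the optimizer very different cardinality estimates, and hence different explain plans, than a uniformly random one.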
So, what to do? Well, there are a number of data sets out there. The best place to look is InfoChimps (http://www.infochimps.com/datasets). This used to be a really great site, but the company is (not unreasonably) seeking to make money from their efforts, so they now restrict access to a lot of their data sets. Nevertheless, many sets are free (although registration is required) or else just links to externally hosted public data sets.
Most of the data sets are CSVs, so there is a certain amount of work required to get them into a database. However, it's not too difficult with external tables, and that is also a useful training in its own right.
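As a rough sketch of the CSV-to-table step (Python, with an in-memory SQLite database standing in for the target schema; in Oracle itself an external table over the CSV file is the more natural route, and the table and column names below are invented):

```python
import csv
import io
import sqlite3

# Hypothetical CSV extract; a real data set would be a file on disk.
raw = "id,name,score\n1,alice,9.5\n2,bob,7.2\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE players (id INTEGER PRIMARY KEY, name TEXT, score REAL)")

# DictReader yields one mapping per CSV row; executemany binds each
# mapping against the named placeholders in a single batched insert.
reader = csv.DictReader(io.StringIO(raw))
conn.executemany(
    "INSERT INTO players (id, name, score) VALUES (:id, :name, :score)",
    reader)
conn.commit()

n = conn.execute("SELECT COUNT(*) FROM players").fetchone()[0]
names = [r[0] for r in conn.execute("SELECT name FROM players ORDER BY id")]
```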
Cheers, APC
Similar Messages
-
Error while retrieving bulk data from mdm in staging server
I am getting an error while retrieving bulk data from MDM. The error goes like this "NiRawReadError: An error occured when reading data from socket.".
Could anyone please suggest the possible cause of this error? Please reply soon.
Moderator message: network error, relation to ABAP development unclear, please search for available answers, check with system administrators, consult SAP service marketplace.
Edited by: Thomas Zloch on Nov 22, 2010 5:16 PM
Can you elaborate? I don't think the /*+ APPEND */ hint is working for me; I am still getting the same error.
If you have any other suggestion,I would like to hear.
Should I not commit after every 500 records or so? At the moment I am committing once after the whole data set is inserted. -
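On the commit-frequency question just above: a minimal sketch of committing every N rows (Python with SQLite as a stand-in; note that in Oracle, committing more often does not necessarily speed a load up, and a direct-path /*+ APPEND */ insert requires a commit before the same transaction can touch the table again, so treat this purely as an illustration of the batching pattern):

```python
import sqlite3

def load_in_batches(conn, rows, batch_size=500):
    """Insert rows, committing every batch_size rows rather than once at
    the very end -- each transaction stays small. Illustrative only; the
    right commit frequency is database- and workload-specific."""
    cur = conn.cursor()
    commits = 0
    for i in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO t (v) VALUES (?)", rows[i:i + batch_size])
        conn.commit()
        commits += 1
    return commits

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (v INTEGER)")
n_commits = load_in_batches(conn, [(i,) for i in range(1200)], batch_size=500)
total = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```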
Performance problems on bulk data importing
Hello,
We are initially importing 3,500,000 customers and 9,000,000 sales orders from an
external system to the CRM system. We developed ABAP programs
that use standard BAPI functions to import the bulk data.
We have seen that this process will take approximately 1.5 months
to finish. That is a very long time for us to wait;
we want to complete this job in about a week.
Have we done something wrong? For example, is there another fast,
SAP-standard way to import bulk partners and sales orders without
developing ABAP programs?
best regards,
Cuneyt Tektas
Hi Cuneyt,
SAP standard supports import from external source. You can use XIF adapter or you can also use ECATT.
Thanks,
Vikash. -
Optimization for bulk data upload
Hi everyone!
I've got the following issue:
I have to do a bulk data upload using JMS, deployed in GlassFish 2.1, to process and validate data against an Oracle 10g DB before it is inserted.
I have a web interface that loads a file and then delegates the process to a Stateless Session Bean, which reads N lines at a time and then sends a message to a JMS queue. The MDB has to parse each line, validate it against the data already in the DB, and finally persist the new data.
This process is very CPU-intensive, and I need to improve the processing time. I tried changing the GlassFish default JMS and JDBC pool sizes, but that made no great difference.
Do you have any advice that could help me?
Thanks in advance!
Hi! Thank you for your answer!
The heavy processing is in the MDB.
I'm grouping every N lines read in the EJB and then sending the message to the JMS queue. The MDB then processes and persists each line into different related tables.
Thanks again! -
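The batching described in the thread above (group N lines per JMS message, so per-message overhead is paid once per batch rather than once per line) can be sketched as follows (Python, illustrative only):

```python
def chunk_lines(lines, n):
    """Group lines into batches of n. Each batch would become one JMS
    message payload -- fewer messages, less per-message overhead."""
    return [lines[i:i + n] for i in range(0, len(lines), n)]

# Ten hypothetical input lines, grouped four at a time.
batches = chunk_lines([f"line-{i}" for i in range(10)], 4)
```

Tuning the batch size N against the MDB pool size is usually where the real throughput gains are; the grouping itself is this simple.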
PeopleSoft CS SAIP Announce Status Issue in Bulk Data Exchange Status
XML is generated in the provided directory path under the SAIP “Web Service Targets”, but “Announce Status” is blank under Bulk Data Exchange Status, and even the “Event Message Monitor” shows nothing!
We have activated all SAIP service operations and their delivered routings on our side.
The transaction status under the Bulk Data Exchange Status page says Announced, but “Announce Status” is blank on the same page.
Announce status should have one of these possible values per PeopleBooks (Connector Error, Failure, Processing, Success, Unsupported).
What could be wrong? Please help. Thank You...
Regards,
Ashish
You are welcome. I'm glad you got it back up.
(1) You say you did the symbolic link. I will assume this is set correctly; it's very important that it is.
(2) I don't know what you mean by "Been feeding the [email protected] for several weeks now, 700 emails each day at least." After the initial training period, SpamAssassin doesn't learn from mail it has already processed correctly. At this point, you only need to teach SpamAssassin when it is wrong. [email protected] should only be getting spam that is being passed as clean. Likewise, [email protected] should only be getting legitimate mail that is being flagged as junk. You are redirecting mail to both [email protected] and [email protected] ... right? SpamAssassin needs both.
(3) Next, as I said before, you need to implement those "Frontline spam defense for Mac OS X Server." Once you have that done and issue "postfix reload" you can look at your SMTP log in Server Admin and watch as Postfix blocks one piece of junk mail after another. It's kind of cool.
(4) Add some SARE rules:
Visit http://www.rulesemporium.com/rules.htm and download the following rules:
70sareadult.cf
70saregenlsubj0.cf
70sareheader0.cf
70sarehtml0.cf
70sareobfu0.cf
70sareoem.cf
70sarespoof.cf
70sarestocks.cf
70sareunsub.cf
72sare_redirectpost
Visit http://www.rulesemporium.com/other-rules.htm and download the following rules:
backhair.cf
bogus-virus-warnings.cf
chickenpox.cf
weeds.cf
Copy these rules to /etc/mail/spamassassin/
Then stop and restart mail services.
There are other things you can do, and you'll find differing opinions about such things. In general, I think implementing the "Frontline spam defense for Mac OS X Server" and adding the SARE rules will help a lot. Good luck! -
How to insert bulk data into ms-access
Hi,
I am trying to insert bulk data into MS Access. I used Statement, which
works fine but does not allow inserting a single quote. Then I tried
PreparedStatement, which allows single quotes but not bulk data. I am getting the following error:
javax.servlet.ServletException: [Microsoft][ODBC Microsoft Access Driver]String data, right truncated (null)
please help me..
guru
Have you tried the MEMO datatype in Access?
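On the single-quote problem: binding values as parameters, rather than concatenating them into the SQL string, sidesteps quoting entirely. A sketch (Python with SQLite standing in for the JDBC/Access setup; the table is invented). The "String data, right truncated" error itself usually means a value is longer than the target column, which is what the MEMO suggestion addresses:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (body TEXT)")

# Bound parameters carry the quote safely: no string concatenation,
# so nothing to escape and no mangled SQL literal.
rows = [("O'Brien",), ("plain text",)]
conn.executemany("INSERT INTO notes (body) VALUES (?)", rows)
conn.commit()

stored = [r[0] for r in conn.execute("SELECT body FROM notes ORDER BY body")]
```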
-
Enhancement_CIN_Capture Incoming Excise Invoice-J1IEX bulk data upload
Dear All,
Sub:CIN_Capture Incoming Excise Invoice-J1IEX bulk data upload option requirement
We are capturing the incoming excise invoices manually in
transaction J1IEX with huge volumes of data, and at this volume
it is very difficult for us to enter the data manually. We now require
a bulk data processing option to upload the data from an Excel
file (we received the softcopy from the supplier).
As per our observations, we found the BAPI named
BAPI_EXCINV_CREATE_FROMDATA, but the required update level of this BAPI is not
available in our system, because as per the Indian Government norms one
of the current Excise Duty tariffs is as below:
1. Basic Excise Duty (BED 8%).
2. Education Cess (ECess 2%)
3. Secondary Education Cess (SECess 1%)
and we observed that the SECess (1%) is not covered in the above-mentioned
BAPI, so we are not in a position to proceed further.
So kindly update us on whether any other relevant option will solve the purpose.
We are in a quite difficult situation to upload the data to our system,
so please do the needful.
Regards,
PrabuM
Please note that CIN uses the 'MB_MIGO_BADI' definition and 'CIN_PLUG_IN_TO_MIGO' is the implementation. You can create multiple implementations of this BADI. The same BADI is used for single-step Capture & Post of excise invoices in MIGO. Kindly use this BADI as per your needs. SAP standard does not support any BAPIs for Goods Receipts with Excise updates.
-
To upload bulk data through FI transaction F-02
Hey All,
The requirement here is to upload bulk data through FI transaction F-02.
Using a BDC for the data upload is not suggested for this transaction, as the screen sequence changes as per the values of the Posting Key (and in some rare cases according to the value of Account).
Authorisations for LSMW haven't been provided at this site.
I looked up SDN which showed many threads in which it was agreed that coding a BDC to handle such a dynamic sequence of screens is very complex. Some people suggested BAPIs as an alternative. Namely - BAPI_ACCT_DOCUMENT_POST and BAPI_TRANSACTION_COMMIT .
But when I searched for the BAPI BAPI_ACCT_DOCUMENT_POST in SE37, it did not exist! The SAP version here is 4.6C.
Any suggestions ?
-ashrut .
Extra Info -
The posting keys I have to use are - GL Account Debit, GL Account Credit , Vendor Debit, Vendor Credit .
And there is the special case of a GL Account mapping to a Sales Order No for the GL Account Posting Keys.
SDN Links - BDC for FB01
http://sap.ittoolbox.com/groups/technical-functional/sap-acct/prog-rfbibl00-230755?cv=expanded
When we implemented, we used this program to upload all of our financial activity. We did not use LSMW and never have. We continue to use the program for interfaces. It is well documented, so check that out.
Rob -
Upload bulk data into sap?
hi all,
Let me know if there are any methods to upload bulk data into SAP, and whether the same data can then be modified, deleted, or added to. Please correct me if I am wrong: what I know is that we can do this with the LSMW method. As I am new to LSMW, please let me know where to hunt for LSMW documentation.
thanks,
john dias.
Hi John-
According to SAP, the Data Transfer Workbench supports the automatic transfer of data into the system. The Workbench is particularly useful for various business objects with large amounts of data. It guarantees that data is transferred efficiently and ensures that data in the system is consistent.
Further, the Legacy System Migration Workbench (LSMW) is a tool recommended by SAP that you can use to transfer data, once only or periodically, from legacy systems into an R/3 System.
For your purpose you might be helped by the following two links-
'Data Transfer Workbench' - http://help.sap.com/saphelp_47x200/helpdata/en/0d/e211c5543e11d1895d0000e829fbbd/frameset.htm
'Using the LSM Workbench for the Conversion' - http://help.sap.com/saphelp_46c/helpdata/en/0f/4829e3d30911d3a6e30060087832f8/frameset.htm
Hope these links help,
- Vik. -
Hi
We have a requirement to load bulk data which would be a full dump (and not incremental) in CSV format almost every week from other applications.
This implies that I can drop my tables and rebuild the same using the CSV files that I have received.
I was just wondering whether there is any really efficient tool or utility in Oracle (or outside) to import huge amounts of data (apart from SQL*Loader, external tables and Data Pump).
Regards
Kapil
I don't know of any tool apart from loader/Ext-table and Datapump.
You may find tools which you can buy (and claim they are really good).
Honestly, if you want to load flat-file data (gigabytes or kilobytes) into Oracle, there is nothing better than SQL*Loader, "if you use all its capabilities" (external tables and the loader are the same thing; just the wrapper is different).
Cheers -
How to do Bulk data transfer using Web Service
In my application I have to write various web services, but the majority of them have to
query the database and return bulk data (>10K rows) through the web service.
So I would like to ask: what is the efficient way of transferring bulk data using a web service? Presently
I am returning the data set as an XML string (built with StringBuilder) from the web service and consuming it at the client end.
Is there a better way to do this in a web service?
My env:
Front end: can be any technology, e.g. a UI in C#
Back end : Tomcat 6 on Java 6 with Axis2
Thanks in advance
Innova wrote:
But then I also have to mention a return type, although it is XML that is getting transferred.
Can you provide me with a small sample code? Say I have an Emp object with properties
Empname, EmpID, DOJ, DOB, Dpt. In this case, what will be the return type?
My major concern is that my result set is going to be huge (>10,000 rows), so there is
that much time for forming the XML and then for the transfer. How can I reduce the transfer
time? I mean a faster, optimised approach for transferring large data as part of a web service.
Simply put, how can I transfer bulk data in minimum time, so that I can reduce the
waiting time at the client side for the response?
Thanks in advance
I've done this with thousands before now, and not had a performance problem... at least nothing worth writing home about.
Your return type will be an array of Emp objects. You can use a SOAP array for that (you'll need to look that up for the syntax, since I can't remember it off the top of my head), which gets restricted to being an array of Emp.
Should the normal technique prove to be too slow then you should look at alternatives, but I would expect this to work well enough. And you have no faffing about with devising a new transfer system that then has to be explained to your client system...it'll all be standard. -
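If the single 10K-row response does turn out to be too slow, one standard fallback is paging: return the result in fixed-size chunks so the client can start consuming before the whole set is serialized. A sketch (Python; the Emp fields are invented):

```python
def page_results(rows, page_size):
    """Yield successive fixed-size pages of a large result set, so the
    client consumes chunk by chunk instead of waiting for one huge
    serialized response."""
    for i in range(0, len(rows), page_size):
        yield rows[i:i + page_size]

# 10,000 hypothetical Emp records, served in pages of 2,000.
emps = [{"EmpID": i, "Empname": f"emp{i}"} for i in range(10_000)]
pages = list(page_results(emps, 2_000))
```

In a SOAP setting each page would be one call (e.g. with offset/limit parameters), at the cost of extra round trips; as the reply above says, try the plain array first.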
Uploading bulk data in v_512w_d
Hi gurus,
I have to upload bulk data in table V_512W_D.
Please let me know the best way to do it.
Thanks,
Adesh
Hi,
We had a similar requirement.
At that time, what we did was: we downloaded the data from the different tables in the source system.
We created a program in the Development server to upload the data into the target Development system,
and then transported the table with its data to Quality and Production.
Hope this helps.
Pradeep. -
Transfer bulk data using replication
Hi,
We are having transactional replication setup between two database where one is publisher and other is subscriber. We are using push subscription for this setup.
The problem comes when we have bulk data updates on the publisher. On the publisher side the update command completes in 4 minutes, while the same takes approx 30 minutes to reach the subscriber side. We have tried customizing the different properties in the Agent
Profile, like MaxBatchSize, SubscriptionStreams etc., but none of this is of any help. I have tried breaking up the command and a lot of permutations and combinations, but no success.
The data that we are dealing with is around 10 million rows, and our production environment is not able to handle this.
Please help. Thanks in advance!
Samagra
How are the production publisher server and subscriber server configured? Are both the same? How about the network bandwidth? Have you tried the same task during working hours and during off hours? I am thinking the problem may be with the network as well as with the configuration of both servers.
If you are doing huge operations with replication this is always expected; either you should have that much capacity, or you should divide the workload on your publisher server to avoid all these issues. Why can't you split the transactions?
Raju Rasagounder Sr MSSQL DBA -
Hi all,
Good morning.
What exactly does the "Allow Bulk Data Load" option on the Company Profile page do? It is clear in the doc that it allows CRM On Demand consultants to load bulk data, but I am not clear on how they load it, i.e. whether they use any tools other than those the admin uses for data uploading.
any real time implementation example using this option would be appreciated.
Regards,
Sreekanth.
The Bulk Data Load utility is a utility similar to the Import Utility that On Demand Professional Services can use for import. The Bulk Data Load utility is accessed from a separate URL, and once a company has allowed bulk data load then we would be able to use the Bulk Data Load utility for importing their data.
The Bulk Data Load utility uses a similar method to the Import Utility for importing data, with the differences being that the number of records per import is higher and that you can queue multiple import jobs.
Delete bulk data from multiple tables
Hi,
I have two tables from which data needs to be deleted based on some WHERE clause.
Can anyone help me with how to delete bulk data, meaning more than 10000 rows at a time, so as to improve performance?
Regards,
Dinesh
LPS wrote:
This will be a simple delete statement; make sure the columns referred to in the WHERE clause are indexed.
Indexing may or may not help. If he is deleting 10000 rows out of 20000 it won't help at all. In fact, indexing may
make things worse, as it will slow down the delete. -
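For completeness on the "more than 10000 rows at a time" question above: one common pattern is deleting in limited batches, each in its own transaction. A sketch (Python with SQLite; the table, column, and the LIMIT-subquery trick are illustrative, and whether batching actually beats one big DELETE depends on the database):

```python
import sqlite3

def delete_in_batches(conn, batch=1000):
    """Delete matching rows a batch at a time, committing between batches
    so each transaction (and its undo) stays small. Illustrative only."""
    deleted = 0
    while True:
        cur = conn.execute(
            "DELETE FROM orders WHERE rowid IN "
            "(SELECT rowid FROM orders WHERE status = 'OLD' LIMIT ?)",
            (batch,))
        conn.commit()
        if cur.rowcount == 0:  # no matching rows left
            break
        deleted += cur.rowcount
    return deleted

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?)",
                 [("OLD",)] * 2500 + [("NEW",)] * 100)
total = delete_in_batches(conn)
remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

As the reply above notes, for a delete touching half the table a plain single DELETE (or even CTAS-and-rename) may well be faster; batching mainly helps keep transactions and locks manageable.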
URGENT : Return Bulk data from Stored Procedure
Hi,
Tell me, how do I return a bulk of data, which does not exist in the database but is derived while the stored procedure executes, from the stored procedure to a C++ program?
For example:
Table ABC
Field1 Field2 Field3
A 1 3
B 1 5
C 2 10
Table DEF
Field1 Field2 Field3
D 10 24
E 3 16
F 8 19
SP_TESTING
Depending on the values in both tables, for some range of conditions, a conclusion X is derived for each value in the condition range. Now I need to return this bulk of data X, together with the condition each value belongs to, back to the C++ code calling it.
NOTE: A stored procedure is required, as there is a lot of processing needed before we conclude the result X for each value in the condition range. If I execute this logic from C++ instead of a stored procedure it is very slow, and speed is a prime requirement of my system.
Also, I'm not using any MFC class to access the database; I'm using _ConnectionPtr, _RecordsetPtr and _CommandPtr from msado15.dll for database access.
One solution could be the use of temp tables. But as this process is used by a lot of different stored procedures, a temp table common to all of them would need something like 50 NUMERIC fields, 50 VARCHAR fields and so on, which doesn't seem like a very good solution; it sounds like something I would have done while in school, a dumb implementation.
So, please suggest a solution as to how I return bulk data in the form of recordsets from a stored procedure.
Regards
Shruti
Use OUT parameter mode:
SQL> CREATE OR REPLACE procedure a1 (x OUT NUMBER, y OUT NUMBER) AS
2 BEGIN
3 x:= 1;
4 y:= 2;
5 END;
6 .
SQL> /
Procedure created.
SQL> SET SERVEROUTPUT ON
SQL> DECLARE
2 a NUMBER :=3;
3 b NUMBER :=4;
4 BEGIN
5 a1 (a,b);
6 DBMS_OUTPUT.PUT_LINE( 'a = ' || a );
7 dbms_output.put_line( 'b = ' || b );
8 END;
9 .
SQL> /
a = 1
b = 2
PL/SQL procedure successfully completed.
By default, OUT parameters are copied (pass by value); the default copy semantics
do not send a pointer to the calling program unit, but the NOCOPY hint does (pass by reference).
Khurram