Bulk data loading

To load data into a 10.2 database from another Oracle database (8.1), what options are available? SQL*Loader, create table as select over a database link, or export and import?
Any pros and cons?

Troll35 wrote:
Hello,
I've created a db link between 10g and 8i without any problem; the problem is between 11g and 8i. It's very release-specific, and there can be unknown problems in unsupported links between new and old versions.
Typically it's best to assume that database links are only supported back to the previous version of an Oracle database, i.e. 10g to 9i or 11g to 10g; any wider span than that may have issues.
The support matrix is quite complex, and I think you can only get the official one through Oracle Support (Metalink as it was), though it seems there's a version on the following website...
http://www.myoraclesupports.com/content/client-server-interoperability-support-between-different-oracle-versions
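Of the three options in the original question, export/import is often the most forgiving across a version gap as wide as 8.1 to 10.2 (run exp with the older client against the source, then imp into the target). A minimal sketch; every connect string, schema and file name below is a placeholder:

```shell
# Generate parameter files for a classic exp/imp transfer.
# Credentials, TNS aliases and schema names are all placeholders.
cat > exp_hr.par <<'EOF'
userid=hr/hr@ora81
owner=HR
file=hr_full.dmp
log=hr_exp.log
EOF

cat > imp_hr.par <<'EOF'
userid=hr/hr@ora102
fromuser=HR
touser=HR
file=hr_full.dmp
log=hr_imp.log
EOF

# The actual runs need Oracle client installs, so they are shown
# commented out:
# exp parfile=exp_hr.par   # run with the 8.1 client against the source
# imp parfile=imp_hr.par   # run against the 10.2 target
```

A create table as select over a database link avoids the intermediate dump file, but as noted above, a 10.2-to-8.1 link sits outside the usually supported version span.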

Similar Messages

  • Reg "Allow Bulk Data Load"

    Hi all,
    Good morning.
    What exactly does the "Allow Bulk Data Load" option on the Company Profile page do? The documentation says it allows CRM On Demand consultants to load bulk data, but I am not clear on how they load it: do they use any tools other than the ones an administrator uses for data uploading?
    Any real-world implementation example using this option would be appreciated.
    Regards,
    Sreekanth.

    The Bulk Data Load utility is a utility similar to the Import Utility that On Demand Professional Services can use for import. The Bulk Data Load utility is accessed from a separate URL and once a company has allowed bulk data load then we would be able to use the Bulk Data Load Utility for importing their data.
    The Bulk Data Load utility uses a method similar to the Import Utility's, the differences being that the number of records per import is higher and you can queue multiple import jobs.

  • How to improve performance for bulk data load in Dynamics CRM 2013 Online

    Hi all,
    We need to bulk update (or create) contacts in Dynamics CRM 2013 Online every night to pick up changes from an external data source. The data size is around 100,000 records, and the load currently takes around 6 hours.
    We are already using the ExecuteMultiple web service to handle the integration; however, the 6-hour integration duration is still not acceptable, and we are seeking advice on further improvement.
    Any help is highly appreciated.  Many thanks.
    Gary

    I think Andrii's referring to running multiple threads in parallel (see
    http://www.mscrmuk.blogspot.co.uk/2012/02/data-migration-performance-to-crm.html - it's a bit dated, but should still be relevant).
    Microsoft do have some throttling limits applied in Crm Online, and it is worth contacting them to see if you can get those raised.
    100,000 records per night seems a large number. Are all of these new or updated records, or are some unchanged, in which case you could filter them out before uploading? Or are there useful ways to summarise the data before loading?
    Microsoft CRM MVP - http://mscrmuk.blogspot.com/ http://www.excitation.co.uk
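One way to filter out unchanged records, as suggested above, is to diff tonight's extract against last night's before any calls hit the CRM API. A sketch with made-up file names and a made-up three-column layout:

```shell
# Hypothetical nightly extracts, one record per line: id,name,email.
printf '1,Ann,ann@x.com\n2,Bob,bob@x.com\n3,Cat,cat@x.com\n' > yesterday.csv
printf '1,Ann,ann@x.com\n2,Bob,bob@y.com\n4,Dan,dan@x.com\n' > today.csv

# Keep only rows that are new or changed since yesterday: sort both
# files and emit lines unique to today's extract.
sort yesterday.csv > y.sorted
sort today.csv    > t.sorted
comm -13 y.sorted t.sorted > delta.csv

cat delta.csv   # only Bob's changed row and the new Dan row remain
```

With `comm -13`, only lines unique to the second (today's) file survive, so unchanged contacts are never uploaded at all.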

  • Is HCM Data Loader functional on fusion HCM release 8? or will it be on release 9?

    I understand that the long-term plan for data loading is to provide a single tool, which is referred to as HCM Data Loader.
    HCM Data Loader is intended to provide the standard data-import solution and a single entry point for managing bulk data loading to Oracle Fusion HCM.
    Which task is used to run this tool? (Is there a document with explanations?)
    Also, can someone explain briefly when to use each of the functions currently delivered by HCM File-Based Loader, HCM Spreadsheet Data Loader,
    and the specialized data loaders, and the tasks needed to reach these tools?

    I have no knowledge of HCM feature availability; however, this seems to be a duplicate of a support forum thread. Adding the link here in case someone else is looking for the same information.
    Jani Rautiainen
    Fusion Applications Developer Relations
    https://blogs.oracle.com/fadevrel/

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load csv files into an existing table. It works fine with small files of up to a few thousand rows, but when loading 20k rows or more the loading process becomes very slow. The table has a single numeric column as primary key.
    The primary key is declared at Shared Components -> Logic -> Data Load Tables and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    This makes the loading process slow, because the UPPER function prevents any index from being used.
    It seems that the "case sensitive" setting is not evaluated.
    Dropping the numeric index on the primary key and using a function-based index does not help.
    The explain plan shows an implicit TO_CHAR conversion:
    UPPER(TO_CHAR("PK")) = UPPER(:UK_1)
    This conversion is missing from the query text, but it may be necessary for a function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin ( - Process Type Plugin - EXCEL2COLLECTIONS )
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the package (i.e. your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table or view, and to the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
    MK
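As a possible workaround for the UPPER(TO_CHAR(...)) predicate in the original question, a function-based index matching the exact expression from the explain plan might let the per-row lookup use an index. This is only a sketch using the table name from the post, not something tested against APEX itself:

```shell
# Write the DDL for a function-based index matching the predicate
# UPPER(TO_CHAR("PK")) = UPPER(:UK_1) seen in the explain plan.
cat > fbi_workaround.sql <<'EOF'
CREATE INDEX pd_if_csv_row_fbi
  ON "KLAUS"."PD_IF_CSV_ROW" (UPPER(TO_CHAR("PK")));
EOF
# Run it with, e.g.:  sqlplus klaus/secret @fbi_workaround.sql
# (placeholder credentials)
```

The key point is that a function-based index is only usable when its expression matches the query's predicate character for character, including the implicit TO_CHAR.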

  • Insert OR Update with Data Loader?

    Hello,
    Can I Insert OR Update at the same time with Data Loader?
    How can i do this?
    Thanks.

    The GUI loader wizard does allow for this including automatically adding values to the PICKLIST fields.
    However, if you mean the command line bulk loader, the answer is no. And to compound the problem, the command line version will actually create duplicates for some of the objects. It appears that the "External Unique Id" is not really "unique" (as in constrained via unique index) for some of the objects. So be very careful when you prototype something with the GUI loader and then reuse the map on the command line version.
    You will find that some objects can work well with the command line loader (some objects will not).
    Works well (just a few examples):
    Account (assuming your NAME,LOCATION fields are unique).
    Financial Product
    Financial Account
    Financial Transaction
    Will definitely create duplicates via command line bulk loader:
    Contact
    Asset
    Also be aware that you might hear during a go-live that Oracle will temporarily remove the 30k record limit on bulk loads. I have not had any luck with Oracle Support making that change (for 2 clients specifically in the last 12 months).
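Given that the command-line loader can create duplicates for some objects, one defensive step is to de-duplicate the input file on the external-id column before loading. A sketch with a hypothetical three-column extract (the file name and layout are assumptions):

```shell
# Hypothetical contact extract: ExternalId,FirstName,LastName.
cat > contacts.csv <<'EOF'
ExternalId,FirstName,LastName
C001,Ann,Smith
C002,Bob,Jones
C001,Ann,Smyth
EOF

# Keep the header, then the first row seen for each ExternalId.
head -n 1 contacts.csv > contacts_dedup.csv
tail -n +2 contacts.csv | awk -F',' '!seen[$1]++' >> contacts_dedup.csv
wc -l < contacts_dedup.csv   # 3 lines: header + 2 unique contacts
```

The `!seen[$1]++` idiom keeps only the first occurrence of each id, so the loader never sees the same "External Unique Id" twice in one file.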

  • Optimization for bulk data upload

    Hi everyone!
    I've got the following issue:
    I have to do a bulk data upload using JMS, deployed on GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
    I have a web interface that loads a file and then delegates processing to a stateless session bean, which reads N lines at a time and sends a message to a JMS queue for each batch. The MDB has to parse each line, validate it against the data already in the DB, and finally persist the new data.
    This process is very processing-intensive, and I need to improve its running time. I tried changing the GlassFish default JMS and JDBC pool sizes, but saw no big difference.
    Do you have any advice that could help me?
    Thanks in advance!

    Hi! Thank you for your answer!
    The heavy processing is in the MDB.
    I'm grouping every N lines read in the EJB and then sending the message to the JMS queue. The MDB then persists each line into different related tables.
    Thanks again!

  • Statistic on throughput of data loader utility

    Hi All
    Can you guys share some statistics on the throughput of the Data Loader utility? For a concrete number, consider 1 million records: how long would it take to import them?
    I need these numbers to decide between web services and the Data Loader utility. Any suggestion is appreciated.
    Thank you.

    It really depends on the object and the amount of data in there (both the number of fields you are mapping, and how much data is in the table).
    For example…
    One of my clients has over 1.2M Accounts. It takes about 3 hours (multi-tenant) to INSERT 28k new customers. But when we were first doing it, it was sub-1hour. Because the bulk loader is limited on the record count (most objects are limited to 30k records in the input file), you will need to break up your file accordingly.
    But strangely, for the “Financial Account” object (not normally exposed in standard CRMOD), we can insert 30k records in about 30 min (and there are over 1M rows in that table). Part of this is probably due to the number of fields on the account and the address itself (remember the address is a separate table in the underlying DB, even though it looks like there are two sets of address fields on the account).
    The bulk loader and the wizard are roughly the same. However, the command line approach doesn’t allow for simultaneous INSERT/UPDATE (there are little tricks around this; it depends on how you prepare the extract files from your other system: an UPDATE file and an INSERT file. Some systems aren't able to produce that split due to the way they are built).
    Some objects you should be very careful with because of the way the indexes are built. For example, ASSET and CONTACT will both create duplicates even when you have an “External Unique Id”. For those, we use web services; you aren’t limited to a file size there. I think (same client) we have over 800k ASSETS and 1.5M CONTACTS.
    The ASSET load (via webservice which does both INSERT and UPDATE) typically can insert about 40k records in about 6 hours.
    The CONTACT load (via webservice which does both INSERT and UPDATE) typically can insert about 40k records in about 10 hours.
    Your best shot is to do some timings via the import wizard, then extrapolate a modest linear increase in time as the amount of data sitting in the tables grows.
    My company (Hitachi Consulting) can help build these things (both automated bulk loaders and web services) if you are interested due to limited resource bandwidth or other factors.
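Since most objects cap the input file at around 30k records, a big extract has to be broken up, and each chunk needs its own header line. A sketch using a tiny stand-in file (all file names are made up; swap `-l 2` for `-l 30000` on real data):

```shell
# Stand-in for a large extract: 5 data rows plus a separate header.
seq 1 5 | sed 's/^/row/' > body.csv
echo 'Id,Name' > header.csv

split -l 2 body.csv chunk_          # use -l 30000 for real data
for f in chunk_*; do
  cat header.csv "$f" > "load_$f.csv" && rm "$f"
done
ls load_chunk_*.csv | wc -l          # 3 chunk files from 5 rows
```

Each resulting file is then a self-contained input for one import job.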

  • Announcing 3 new Data Loader resources

    There are three new Data Loader resources available to customers and partners.
    •     Command Line Basics for Oracle Data Loader On Demand (for Windows) - This two-page guide (PDF) shows command line functions specific to Data Loader.
    •     Writing a Properties File to Import Accounts - This 6-minute Webinar shows you how to write a properties file to import accounts using the Data Loader client. You'll also learn how to use the properties file to store parameters, and to use the command line to reference the properties file, thereby creating a reusable library of files to import or overwrite numerous record types.
    •     Writing a Batch File to Schedule a Contact Import - This 7-minute Webinar shows you how to write a batch file to schedule a contact import using the Data Loader client. You'll also learn how to reference the properties file.
    You can find these on the Data Import Resources page, on the Training and Support Center.
    •     Click the Learn More tab> Popular Resources> What's New> Data Import Resources
    or
    •     Simply search for "data import resources".
    You can also find the Data Import Resources page on My Oracle Support (ID 1085694.1).

    Unfortunately, I don't believe that approach will work.
    We use a similar mechanism for some loads (using the bulk loader instead of web services) for the objects that have a large quantity of daily records.
    There is a technique (though messy) that works fine. Since Oracle does not allow queueing up jobs for objects of the same type (you have to wait for one "account" file to finish before you load the next), you can monitor the .LOG file for the SBL 0363 error, which means you can't submit another file yet (typically because one is already running).
    By monitoring for this error code in the log, you can sleep your process, then try again after a preset amount of time.
    We use this to allow an UPDATE followed by an INSERT on the account, and then a similar technique so "dependent" objects wait for the prime object to finish processing.
    PS: normal Windows .BAT scripts aren't sophisticated enough to handle this. I would recommend either Windows PowerShell or C/Korn/Bourne shell scripts on Unix.
    I hope that helps some.
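The monitor-and-retry technique above can be sketched in a Bourne-style shell. Everything here is a local stand-in: the real loader command is replaced by a function that writes a fake log file, and only the SBL 0363 error code comes from the post:

```shell
submit_load() {            # stand-in for the real command-line loader
  if [ -f slot_busy ]; then
    echo 'SBL-0363: a job for this object is already queued' > loader.log
  else
    echo 'Import submitted OK' > loader.log
  fi
}

touch slot_busy            # simulate: an earlier job is still running
attempts=0
while :; do
  attempts=$((attempts + 1))
  submit_load
  if grep -q 'SBL-0363' loader.log; then
    sleep 1                # use a longer back-off (e.g. 300s) in practice
    rm -f slot_busy        # simulate the earlier job finishing
  else
    break
  fi
done
echo "submitted after $attempts attempt(s)"   # prints: submitted after 2 attempt(s)
```

The same loop extends naturally to sequencing dependent objects: only start the child load once the parent load's log reports success.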

  • Oracle Database Table data Load it into Excel

    Hello All,
    Please I need your help for this problem:
    I need to extract Oracle database table data, load it into Excel and save it in xls format.
    Example: select * from Slase, and load the result into Excel.
    I would appreciate any sample code to help me do that. Please help me out; this is very urgent.
    Thanks alot and best regards,
    anbu

    >
    I need to extract Oracle database table data, load it into Excel and save it in xls format.
    Example: select * from Slase, and load the result into Excel.
    I would appreciate any sample code to help me do that. Please help me out; this is very urgent.
    >
    Nothing in these forums is 'urgent'. If you have an urgent problem you should contact Oracle support or hire a consultant.
    You have proven over and over again that you are not a good steward of the forums. You continue to post questions that you say are 'urgent' but rarely take the time to mark your questions ANSWERED when they have been.
    Total Questions: 90 (78 unresolved)
    Are you willing to make a commitment to revisit your 78 unresolved questions and mark them ANSWERED if they have been?
    The easiest way to export Oracle data to Excel is to use sql developer. It is a free download and this article by Jeff Smith shows how easy it is
    http://www.thatjeffsmith.com/archive/2012/09/oracle-sql-developer-v3-2-1-now-available/
    >
    And One Last Thing
    Speaking of export, sometimes I want to send data to Excel. And sometimes I want to send multiple objects to Excel – to a single Excel file, that is. In version 3.2.1 you can now do that. Let’s export the bulk of the HR schema to Excel, with each table going to its own worksheet in the same workbook.
    >
    And you have previously been asked to read the FAQ at the top of the thread list. If you had done that, you would have seen that the FAQ has links to many ways, with code, to export data to Excel.
    5. How do I read or write an Excel file?
    SQL and PL/SQL FAQ
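Besides SQL Developer, a plain SQL*Plus spool also works if CSV (which Excel opens directly) is acceptable instead of native .xls. A sketch that only generates the script; the table name, columns and login are placeholders standing in for the poster's table:

```shell
# Write a SQL*Plus script that spools one comma-separated row per record.
cat > export_csv.sql <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SPOOL sales.csv
SELECT empno || ',' || ename || ',' || sal FROM sales;
SPOOL OFF
EXIT
EOF
# Run with:  sqlplus -s scott/tiger @export_csv.sql  (placeholder login)
```

The resulting sales.csv can be opened in Excel and saved as .xls from there.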

  • Bulk Data Upload

    Hi
    We have a requirement to load bulk data which would be a full dump (and not incremental) in CSV format almost every week from other applications.
    This implies that I can drop my tables and rebuild the same using the CSV files that I have received.
    I was just wondering if there is any really efficient tool or utility, in Oracle or outside it, to import huge amounts of data (apart from SQL*Loader, external tables and Data Pump).
    Regards
    Kapil

    I don't know of any tool apart from SQL*Loader/external tables and Data Pump.
    You may find tools you can buy (which claim to be really good).
    Honestly, if you want to load flat-file data (gigabytes or kilobytes) into Oracle, there is nothing better than SQL*Loader, "if you use all its capabilities" (external tables and the loader are the same thing; just the wrapper is different).
    Cheers
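"Use all its capabilities" usually means direct-path loading for this kind of weekly full dump. A sketch of a control file; the table, columns and file names are placeholders, not anything from the thread:

```shell
# Write a SQL*Loader control file for a weekly full-dump reload.
cat > weekly_load.ctl <<'EOF'
LOAD DATA
INFILE 'weekly_dump.csv'
TRUNCATE
INTO TABLE weekly_facts
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name, amount)
EOF
# TRUNCATE replaces the table contents, matching a full (non-incremental)
# weekly dump. Run with (placeholder credentials):
#   sqlldr scott/tiger control=weekly_load.ctl direct=true
# direct=true bypasses conventional inserts, which is where most of the
# bulk-load speed comes from.
```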

  • Limitations with the free Informatica Cloud Data Loader

    Hello,
    Can you please help me understand the limitations of the free data loader? In this link - http://www.informaticacloud.com/editions-integration.html# - I see the features listed below.
    •     No-code, wizard-driven cloud integration
    •     Multi-tenant SaaS solution
    •     Database and file connectivity
    •     Flexible scheduling
    •     Bulk API support (for Salesforce.com)
    •     Unlimited rows/day
    •     24 jobs/day
    •     1 Secure Agent
    •     Limited to 1 user
    •     Community support
    •     Cloud Data Masking
    Questions:
    1. When I view licenses in my free data loader, under Feature Licenses, it shows the license type for Salesforce Connectivity/Bulk API as “Trial”. Can’t I create a scheduled data synch task to upsert records in Salesforce using Bulk API mode?
    2. Is the email notification option (for success, warning and failure of a data synch task) available on the free version (and not just as a trial)?
    3. I understand there is a limit of 24 jobs/day, but is there a limit on the number of scheduled data synch tasks that can be created?
    4. Data Masking is listed as a feature above for the free edition. However, when I view the licenses in my free data loader, Data Masking is shown as “Trial”. Can you please clarify this?
    5. Is there a limit on the number of Connections that can be created?
    Thanks
    Sanjay

    Hi,
    The present project has a requirement to delete data from Salesforce objects. We have the following set-up:
    1. Parent objects
    2. Child objects
    3. Cloud Data Synchronization tasks to delete these objects
    Parent and child have LOOKUP relationships between them. Deleting data from the child objects did not give any error. We tried 2 scenarios to delete data from the parent object:
    Scenario 1: delete data from PARENT before deleting CHILD. Result: failed.
    Scenario 2: delete data from PARENT after deleting CHILD. Result: failed.
    Error message received in both cases: "Error loading into target [SF_Object] : Error received from salesforce.com. Fields []. Status code [DUPLICATE_COMM_NICKNAME]. Message [Too many records to cascade delete or set null]."
    Kindly help resolve this error and suggest a method to delete data from PARENT Salesforce objects. Please feel free to ask for more inputs if required.

  • Data Loader API Calls x Webservices

    Hi,
    I'm trying to use the Data Loader calls via web services.
    I've downloaded OracleDataLoaderOnDemandImportServices.wsdl and have been testing it in a C# project, but I don't know exactly how to use the methods BulkOpImportCreateRequest and BulkOpImportSendData.
    I can't create an import bulk request...
    Has someone already used this before?
    Help me, please?
    Best regards,
    Herbert

    TheSilverHammer wrote:
    That's some nice info, but what if it is not a "framework"? These are basic Darwin API calls, not Apple frameworks.
    Last sentence on the page:
    "If you need to link to a library instead of a framework, you can use the -weak_library linker command instead of -weak_framework."
    TheSilverHammer wrote:
    I did not define printf so I can't weak link it.
    This doesn't make any sense. You didn't define any of the symbols in any of the frameworks or libraries Apple provides--that does not inhibit your ability to link against them, weakly or otherwise.

  • ODI data loading Speed

    Hi All,
    The ODI data loading steps are too slow.
    I am using;
    LKM=LKM SQL to Oracle,
    CKM=CKM SQL,
    IKM=IKM SQL Incremental Update, for replicating from SQL Server to Oracle.
    I don't use Flow control in interfaces.
    SQL Server and Oracle database are installed on the same server.
    How can I make it faster?

    If the two database servers are on the same machine and you are dealing with bulk data, you should use an LKM which uses bulk methods (BCP to extract the data from SQL Server, and external tables to get the data into Oracle) - something like this KM: https://s3.amazonaws.com/Ora/KM_MSSQL2ORACLE.zip (which is actually an IKM, but the LKM is not so different).
    Hope it helps.
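The bulk path the reply describes boils down to two steps: BCP the data out of SQL Server to a flat file, then expose that file to Oracle as an external table. A sketch; the table names, directory object and logins are all placeholders:

```shell
# Step 1 (needs a SQL Server client; shown commented out):
# bcp mydb.dbo.Orders out orders.dat -S localhost -U sa -P secret -c -t,

# Step 2: generate the external-table DDL that reads the BCP output.
cat > ext_orders.sql <<'EOF'
CREATE TABLE orders_ext (
  order_id   NUMBER,
  customer   VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('orders.dat')
);
EOF
# Then load with:  INSERT /*+ APPEND */ INTO orders SELECT * FROM orders_ext;
```

This avoids row-by-row transfer over a heterogeneous connection, which is usually where the time goes.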

  • Bulk data transfer

    Hi,
    I want to transfer bulk data from SAP R/3 to third-party systems.
    Is using SAP XI the correct solution?
    Will SAP XI be able to handle millions of records?
    Since the data is persisted in SAP XI, won't that be a constraint?
    What will be the best solution to transfer bulk data from SAP R/3?
    Thanks & Regards
    Monzy

    Hi Monzy,
    We have also faced this situation, where we had to think about using XI for mass data volume loading into and out of R/3.
    First you have to know whether your interface is quasi-permanent or a one-time interface. We do all one-time interfaces with an ABAP program, ALE, IDocs or LSMW, and not through XI.
    If your interface is permanent, then the answer should be yes to XI. If your interface is not permanent but would run once a month or quarterly, then the answer should also be yes to XI.
    If you use XI with mass data, you have to think about archiving or deleting XML messages regularly from the XI system (IS and AFW separately). We now use the deletion option in our production system: every day we delete a batch of XML messages older than 30 days.
    Synchronous messages are not persisted, so no need to worry about them. Some erroneous messages have to be archived and cannot be deleted from the system.
    Hope this helps you in your decision.
    Regards,
    Ly-Na Phu
