Problem migrating a large amount of data from MySQL to Oracle using OMWB.

I'm using OMWB to migrate my MySQL database to Oracle. The MySQL database contains 43 tables, and all of them were copied to Oracle except one table, "as_db", which holds about 503 MB of data. The table has 3 columns: two of them are of integer type and one is a MEDIUMBLOB. I don't get any errors while migrating. After the migration I opened the table in Oracle: the table itself was created, but it contains no data, while all the other tables do contain data.
Then I generated the CTL file for that particular table alone and tried to load it with SQL*Loader, but that also fails with a "column exceeds maximum length" error on the BLOB column.
How do I handle this issue?
Thanks in Advance,
Jai.M

Looks like you may have to save your BLOBs out as individual files and load them through the sqlldr process. I know this has normally been done through RAW to BLOB.
In what format is your BLOB data being saved out in the extract data file?
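If it helps, here is the kind of control file I mean - the extract file carries only a file name per row and sqlldr reads each BLOB from its own file (the table and column names below are guesses based on your description):

-- hypothetical layout for as_db (two integer columns plus the mediumblob)
-- the third field in the extract file is the name of the file holding the blob
LOAD DATA
INFILE 'as_db.dat'
INTO TABLE as_db
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  id         INTEGER EXTERNAL,
  ref_id     INTEGER EXTERNAL,
  blob_name  FILLER CHAR(255),
  data       LOBFILE(blob_name) TERMINATED BY EOF
)

With the BLOB coming from its own file it no longer has to fit inside the delimited record, which should get you past the "column exceeds maximum length" error.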
Barry

Similar Messages

  • DB Migration from MYSQL to ORACLE Using Offline Capture

    Hi
    I am doing a database migration from MySQL to Oracle using SQL Developer (version 2.1.1.64). So far, I've successfully captured the MySQL database and converted it to the Oracle model. When generating offline scripts to create the converted model schema as Oracle DDL, it generated SQL to create: 1) users, 2) sequences, 3) tables, 4) triggers and 5) constraints.
    It has created the SQL to add the primary key constraints and index constraints. Although it generated the foreign key constraints in the SQL, they seem to have lost their cascading options, i.e. there is no indication of whether a foreign key constraint will restrict or cascade on delete.
    We have foreign keys in the MySQL database with different cascading options, and these have not been carried over into the migration SQL. As a result, all the foreign keys generated in the SQL default to restrict on delete.
    Does 'Generate Oracle DDL' not take into account a foreign key's on delete cascading option?
    Any help or information would be greatly appreciated.
    Thanks

    Hello,
    That reminds me of the following thread:
    Migration Microsoft SQL Sever 2005 to Oracle 11g cascade on delete problem
    That is a similar issue, isn't it?
    I opened a bug for that, and it will be fixed in SQL Developer 3.1 (not in any 3.0 Early Adopter version). If you hit the same issue, there is no other way than using the workaround described in that thread.
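    Until then, the workaround boils down to recreating the affected constraints by hand with the rule you had in MySQL, something like this (the table, column and constraint names are only placeholders):

    -- drop the generated constraint and recreate it with the original cascade rule
    ALTER TABLE order_items DROP CONSTRAINT fk_order_items_orders;
    ALTER TABLE order_items
      ADD CONSTRAINT fk_order_items_orders
      FOREIGN KEY (order_id) REFERENCES orders (order_id)
      ON DELETE CASCADE;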
    Regards
    Wolfgang

  • Copy data from mysql to oracle database help?

    I'm an SQL newbie trying to understand how I can automatically copy data from one database to another system's database on the same server.
    We have two different applications that share customer information, but one runs on a Windows server (MySQL) while the other is on Oracle.
    Each time a customer contacts us via online chat (Windows server, MySQL), we want to copy their entire chat transcript into the customer's account in our CRM (Oracle), so the folks who use the CRM can see past chat histories. I hope this makes sense.
    Where can I look to get started on this?
    Thanks

    You could look at Heterogeneous Services (see the Heterogeneous Connectivity forum), but if you want to push data from MySQL to Oracle you might be better off looking at it from the MySQL side - I don't know what it offers.
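    If you do go the Heterogeneous Services route, once the link to MySQL is working the copy itself can be a single statement run from the Oracle side, roughly like this (the link, table and column names are made up):

    -- pull chat transcripts from MySQL through the gateway link into the CRM table
    INSERT INTO crm_chat_history (customer_id, chat_date, transcript)
    SELECT "customer_id", "chat_date", "transcript"
    FROM "chat_log"@mysql_link;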
    It might be simpler to do it at the client end, i.e. cut and paste from the online chat application into a new utility that inserts into the Oracle database.
    Incidentally, this forum is specifically for the SQL Developer tool. You might get more general help in the "Database - General" or "SQL and PL/SQL" forums.

  • Problem while Migrating user data from 10g to 11gR2

    Hi experts,
    I am trying to migrate user data (including passwords and security questions) from 10g to 11gR2. The approach I have followed is:
    From 10g, using the API, I retrieved the user data including passwords and security questions and stored all the information in a HashMap. This is one Java program.
    Then I am trying to create each user in 11gR2 using the API, from the data I retrieved from 10g. From this 11g program I am creating a 10g object and using that HashMap to retrieve the user information. But I am not getting a connection to 10g; it throws an exception like "unknown application server". On both sides I used only the API, as it is recommended to use the API instead of a JDBC connection.
    Please help me in this regard and suggest whether there is any other approach to migrate the user data.
    Thanks in Advance

    By using trusted reconciliation, you won't be able to fetch the password as it is.
    Since your goal is to fetch passwords too, please follow another approach.
    You won't be able to get a connection to both 10g and 11g simultaneously in the same program.
    So break this task into two phases: first connect to 10g and fetch the user data into CSV format, then connect to 11g and read this CSV to create the users.
    Once the users are created properly, use the APIs to create the challenge questions and answers.
    I think you are getting the "unknown application server" exception because you are trying to connect to both the 10g and 11g environments simultaneously.
    Follow these steps:
    (1) Using the 10g APIs you can't obtain the password of a user profile in decrypted form, so fetch the password using tcDataProvider; it will give you the plain-text password.
    (2) In a custom scheduler written for 10g, retrieve this data into a CSV file, for example with a query like:
    String query = "SELECT USR_LOGIN, USR_PASSWORD, USR_FIRST_NAME, USR_LAST_NAME FROM USR"; // add all the fields you want to retrieve from your 10g instance
    (3) Using this query, tcDataProvider, tcDataSet and Java I/O (or any third-party CSV tool, such as the ones in csv.jar in the XL_HOME/ext folder), fetch this information into a CSV file.
    (4) Once the CSV is generated, the 10g machine is no longer needed. Connect to 11g using the 11g APIs and write a custom 11g scheduler that reads this CSV and creates a user for each record.
    (5) Once the user records are created in 11g, the difficult part is done. Transfer the security questions too using the same CSV technique.
    Please share results with us.

  • How to load data from Sybase to Oracle using Sybase ODBC drivers?

    Hi ,
    I am trying to create an interface from Sybase to Oracle, using LKM SQL to Oracle, IKM SQL Control Append, IKM SQL Incremental Append and CKM Oracle. When we run the interface we get the error below:
    com.sunopsis.sql.SnpsMissingParametersException: Missing parameter: C1_WORKGROUP_ID
    SQL: insert into ODIUSER.C$_0AGENTLOGINLOGOUT (      C1_WORKGROUP_ID,      C2_USER_ID,      C3_CREATEDT,      C4_UPDATEDT,      C5_REASONID,      C6_LOGOUTDT,      C7_LOGINDT,      C8_SERVICE_ID ) values (      :C1_WORKGROUP_ID,      :C2_USER_ID,      :C3_CREATEDT,      :C4_UPDATEDT,      :C5_REASONID,      :C6_LOGOUTDT,      :C7_LOGINDT,      :C8_SERVICE_ID )
         at com.sunopsis.sql.SnpsQuery.completeHostVariable(SnpsQuery.java:439)
         at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java:1926)
         at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java:122)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.executeUpdate(SnpSessTaskSql.java:3034)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java:729)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2815)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:619)

    Hi all,
    Can anyone help me with creating an interface from Sybase to Oracle?

  • Extract data from ECC to Oracle using Data Services 4.0

    How do I extract data from ECC 6.0 business content extractors to Oracle using SAP BO Data Services 4.0?

    Are you trying to use the SAP BW Business Content to extract data out of ECC and load into Oracle tables with Data Services? If that's the case, then you cannot do that. The SAP BW Business Content was developed to only be used in conjunction with SAP BW. When using Data Services to access the extractors in ECC, it has to have an SAP BW InfoPackage associated with it to execute. In this architecture, Data Services is only a pass through from ECC to BW and allows the ability to do some transformations of data prior to loading into the EDW layer (staging tables basically) on SAP BW.
    To connect ECC to Oracle, you're going to have to have all of the SAP BusinessObjects supplied Function Modules loaded onto ECC, along with a non-dialog logon account that has the ability to pass dynamic ABAP programs, generate the programs and schedule them. Depending on how you want to process the output, you may also have to have the ability to write to files on the ECC application servers and have an FTP account created on the application servers that can GET flat files and potentially DELETE them (you're going to need to delete periodically, otherwise your jobs will crash when the file space allocation has been consumed).

  • Is it possible to migrate a MEDIUMBLOB from MySQL to Oracle?

    I have a MEDIUMBLOB column in MySQL which contains an image, say "one.gif". But after executing the dump_extract.bat file and viewing the data in the txt file, one.gif is appended with a lot of special characters. Because of this I get a "column exceeds maximum length" error when loading through SQL*Loader.
    Thanks in Advance,
    Jai

    Yes. Locate the file named Domain.sites that is located in your Home/Library/Application Support/iWeb folder and copy it to the same folder on the new drive. Launch iWeb and it will locate and open the file.
    If there is no iWeb folder in the Application Support folder, create one.
    NOTE 1: In Lion and Mountain Lion the Home/Library folder is now invisible. To make it permanently visible, enter the following in the Terminal application window: chflags nohidden ~/Library and press the Return key - 10.7: Un-hide the User Library folder.
    Note 2: iWeb 08 is not compatible with Mt. Lion and will experience problems publishing to a folder and crashing. iWeb 3 (09) is compatible with Mt. Lion.
    OT

  • Data from MySQL to Oracle

    Hi All,
    Can anyone please help: how can I establish a database link from Oracle to MySQL via PL/SQL Developer or something else?
    thank you
    Ugur

    Yes, it is possible if your Oracle database is running on Windows.
    You can do this by creating the ODBC connection.
    Install ODBC driver 3.51 on Oracle box
    Create userid on MySQL
    Setup ODBC connection on Oracle box and test until OK
    Check Listener, tnsnames and init.ora are setup as per documentation
    Restart listener
    Test tnsping until OK
    Create a public MySQL_DBLINK database link on Oracle (the user ID and password are case sensitive!) - see the example after these steps
    Test "select count(*) from anytable@MySQL_DBLINK;"
    Check these links on creating the ODBC connection:
    http://www.dba-oracle.com/t_database_link_sql_server_oracle.htm
    http://www.dba-oracle.com/t_heterogeneous_database_connections_sql_server.htm
    Thanks

  • How to handle DATE type problems when migrating from MySQL to Oracle?

    Hi,
    I'm migrating only the data from MySQL to Oracle with the help of SQL*Loader. But I'm not able to insert the date values from MySQL into Oracle: in MySQL the column is defined as DATETIME, and in Oracle it is a TIMESTAMP. Whenever I insert the values through the CTL file, I get an "invalid date format" error. How do I solve this problem?
    Thanks in Advance
    JAI

    You need to supply a format mask for the timestamp entry. See http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#i1006714 for details on datatypes within the CTL file.
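    For example, if the extracted MySQL DATETIME values look like 2008-04-03 16:54:00, the field entry in the control file would be something along these lines (the table and column names are just examples):

    LOAD DATA
    INFILE 'orders.dat'
    INTO TABLE orders
    FIELDS TERMINATED BY ','
    (
      order_id    INTEGER EXTERNAL,
      -- mask matching the string format written out by the MySQL extract
      created_at  TIMESTAMP "YYYY-MM-DD HH24:MI:SS"
    )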
    B

  • Migrating from MySQL to Oracle 11g

    Can you tell me the steps to migrate a database from MySQL to Oracle using SQL Developer?
    I have installed Oracle 11g (11.1.0.6.0) on the server and SQL Developer on my local system.
    I connected to MySQL and Oracle in SQL Developer and tried to migrate everything from MySQL to Oracle using the Quick Migrate option.
    It took one and a half days but didn't work in the end...
    Are there any preliminary tasks I need to do before migrating from MySQL to Oracle?
    Also, when I export the MySQL data to Oracle using SQL Developer, it gives data type mismatch errors (date, boolean, etc.).
    Please let me know what I need to do; any references would also be very much appreciated.

    Hi Turloch,
    I followed the steps mentioned above.
    When I right-click my MySQL server DB connection and click "Capture MySQL Server", it just shows a dialog with "Close" enabled.
    After clicking the Close button, it gives the following error message:
    oracle.dbtools.metadata.persistence.PersistableObject.doInsert(PersistableObject.java:238)
    The same error appears a number of times.
    Kindly help me in this regard.

  • Best practice for loading from mysql into oracle?

    Hi!
    We're planning to migrate our software from MySQL to Oracle, so we need a migration path for moving the customers' data from MySQL to Oracle. The installation and the data migration/transfer have to run in different customer environments, so approaches like installing the Oracle gateway and connecting to MySQL via ODBC, for example, are not an option because they make the installation process more complicated... Also, the installation with the preconfigured Oracle database has to fit on a 4.6 GB DVD...
    I would prefer the following:
    - spool mysql table data into flat files
    - create oracle external tables on the flat files
    - load data with insert into from external tables
    Are there other "easy" ways of doing such migrations, or what do you think about the preferred approach above?
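    To make the idea concrete, this is roughly what I have in mind (the directory, table and column names are made up):

    -- 1) on the MySQL side: spool the table to a flat file
    SELECT customer_id, name, created_at
    INTO OUTFILE '/tmp/customer.txt'
    FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n'
    FROM customer;

    -- 2) on the Oracle side: external table over the same file
    CREATE DIRECTORY mig_dir AS '/tmp';
    CREATE TABLE customer_ext (
      customer_id  NUMBER,
      name         VARCHAR2(200),
      created_at   VARCHAR2(19)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY mig_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ';'
      )
      LOCATION ('customer.txt')
    );

    -- 3) load into the real table, converting types as needed
    INSERT INTO customer
    SELECT customer_id, name, TO_TIMESTAMP(created_at, 'YYYY-MM-DD HH24:MI:SS')
    FROM customer_ext;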
    Thanks
    Markus

    Hi!
    Hasn't anyone else had this requirement for migrations? I have tested with the MySQL SELECT ... INTO OUTFILE clause. It seems to work for simple data types - we're now testing with BLOBs...
    Markus

  • Transfer of data from SAP to Oracle with the help of Enterprise Services

    Hello,
    We want to transfer data from SAP to Oracle using standard Enterprise Services. Some fields were not available in the existing standard Enterprise Services, so we have enhanced the existing services by writing code inside the BADI available with the Enterprise Services; the rest of the fields we have mapped to existing fields available in the standard Enterprise Services. However, the Oracle people want to fetch all data from SAP without entering any mandatory input in the Enterprise Services. The existing standard Enterprise Services require some field to be entered as mandatory and do not accept a range as input for multiple records; e.g. all the Enterprise Services related to sales orders display only one sales order. We have searched all the Enterprise Services for sales orders (related to reading data) but could not find a service that displays multiple records without requiring any input. ECC_SALESORDER009QR is the only service that displays multiple records without any input, but the fields we require are not available in it. So kindly suggest what we need to do next:
    1. Should we go for complete customization of the services so that they fulfil our requirement?
    2. Do standard Enterprise Services exist that would give us data in a range (all records)?
    If they exist, please specify the names of the services for reading purchase orders, production orders, BOMs, etc.
    Thanks & Regards,
    Divya.

    Hi Vaibhav,
    Let me tell you the objective in detail.
    Objective.
    To develop a packaged solution that will work as a bridge between Oracle APS and the SAP system, so that customers using SAP will be able to take advantage of Oracle APS for their planning needs.
    This will consist of the following major components:
    OA Templates is an Oracle utility to load data from any legacy system to Oracle APS using standard flat files.
    Oracle has developed an Application Integration Architecture which is a standard architecture used for integration of Oracle products with other systems.
    Enterprise services is an SAP utility to communicate with SAP.
    AIA canonicals are standard canonicals developed by Oracle to which we have to map data fields from the destination system (Oracle APS) and the source system (SAP).
    Fusion middleware is being used to develop application interfaces following AIA standards.
    Tasks at stake:
    Mapping of Oracle APS fields and SAP Enterprise Service fields to AIA canonicals
    Technical work of developing middleware using Oracle Fusion
    From the SAP side, we have to map the fields we have received from Oracle with the help of Enterprise Services; the consumption of these services is done by the Oracle guys. So please suggest whether there are Enterprise Services available that would give us multiple records.
    Thanks & Regards,
    Divya.

  • Migrating Data from MySQL to SQL Server 2012

    Hi all,
    I'm migrating a database from MySQL to SQL Server 2012, using SSMA for MySQL v5.2.1258. I've got the schema migrated over and have resolved the migration issues (stored procedures / views), but when it comes to migrating over the data I'm just hitting a wall.
    None of the data is migrating, and when the migration report is displayed every table has a red X against its status. The Output box has the following:
    Data migration operation has finished.
    0 table(s) successfully migrated. 
    0 table(s) partially migrated. 
    64 table(s) failed to migrate.
    I've seen on the forum that someone else was having the same problem
    (http://social.msdn.microsoft.com/Forums/en-US/sqlservermigration/thread/b835f4b3-3d93-42a4-9b6b-d21d3dfd8dab/)
    I've set the project settings mode to default and am still getting the same error, and I've tried using both Client Side Data Migration and Server Side Migration, with both giving the same result. I've tried going through the step-by-step blog as well.
    Am I doing something really stupid? There are 64 tables, so I don't really want to do an export from each table and import it into the new database.
    Hope someone can help.
    Cheers
    Alex

    Hello,
    I don't have suggestions for you, but you can try contacting the SQL Server Migration Assistant (SSMA) team via e-mail ([email protected]) to see if they can provide a solution for this scenario.
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • Problem while opening data from MySQL in Excel

    Hi friends
    When I try to download data from MySQL into an Excel sheet, I get the following error:
    This file is not in a recognizable format.
    If you know the file is from another program which is incompatible with Microsoft Office Excel, click Cancel, then open this file in its original application. If you want to open the file later in Microsoft Office Excel, save it in a format that is compatible, such as text format.
    etc.
    The code is:
    Class.forName("org.gjt.mm.mysql.Driver");
    Connection conn = DriverManager.getConnection("xxxxxx");
    Statement st = conn.createStatement();
    StringBuffer sb = new StringBuffer();
    // header row
    sb.append("SAP#" + "\t");
    sb.append("x-plant status" + "\t");
    sb.append("Total Amount" + "\t");
    sb.append(">90 days" + "\t");
    sb.append("\n");
    try {
        String query = "select * from temp_Xplant";
        ResultSet rs = st.executeQuery(query);
        while (rs.next()) {
            sb.append(rs.getString("sapNo") + "\t");
            sb.append(rs.getString("status") + "\t");
            sb.append(rs.getString("amt") + "\t");
            sb.append(rs.getString("days") + "\t");
            sb.append("\n");
        }
        rs.close();
    } catch (Exception e) {
        out.println("error");
    } finally {
        st.close();
        conn.close();
    }
    // set the headers before writing the body; the content is really tab-delimited
    // text with an .XLS file name, which is why Excel warns about the format
    response.setContentType("application/vnd.ms-excel");
    response.setHeader("Content-Disposition", "attachment; filename=\"test.XLS\"");
    out.print(sb.toString());
    I can open the file after clicking OK, but before that it shows the above error message.
    Thank you
    Edited by: priyap on Apr 3, 2008 4:54 AM

    Hi,
    There are some specific restrictions on downloading data in ALV, i.e. you cannot have more than 1024 characters downloaded in a single row.
    So change your report columns accordingly.

  • Out of memory error in migrating from mysql to oracle

    Hi,
    I'm trying to migrate from MySQL to Oracle 9i. The hardware and software match or even far exceed the requirements mentioned in the documentation, and I have installed OMWB properly (I hope so). When I try to migrate the desired database from MySQL to Oracle, the repository and the Oracle model are created successfully. While migrating the data, everything works fine for the first 35,000 or so row insertions, but after some time the migration stops and no error or anything appears on the screen, and in the error log I get "OutOfMemoryError".
    I tried to migrate a single table as well, but I still get this problem after some time. The tables contain anywhere from 10 to 130,000 rows...
    I even tried changing the %JRE..% memory value to 64M or 128M in the omwb.bat file. The OS is Microsoft Windows XP with 256 MB of RAM.
    Any help is much appreciated.
    Thanks in advance.

    Hi Viji,
    Did the message in the error.log provide any extra information? For example, was the OutOfMemory error prefixed with a string like "cannot create new native thread"? My suggestion for working around the problem would have been to increase the JVM heap size to 128M, but this appears to have been unsuccessful for you. Can you send a mail to the Migration Workbench support services ([email protected]) explaining the problem, and attach the full error.log file please? (Please also copy me on the mail.)
    In the meantime, you can work around this by migrating the schema only (there is a switch on step 3 of the migration wizard asking: "Do you want to migrate the table data to Oracle?" - just select the "No" option). When the schema has been migrated, you can then use the offline data loading facility to migrate the data. This uses data extraction scripts and SQL*Loader to migrate the data to the target Oracle database. You can learn more about the offline data loading facility from the plugins reference guide (from the Help menu in the Migration Workbench).
    I hope this helps,
    Tom.
