Offline data capture for SQLServer2k shows 0 tables

Hi,
I'm evaluating OMWB Offline Data Capture for SQL Server 2000 migration.
I ran the OMWB_OFFLINE_CAPTURE.BAT script for SQL Server 2000. It seems the DAT files are generated and no error messages appear. The problem occurs when I start the OMWB and try to "Capture Source Database".
I specify the directory where the generated DAT files reside; the DAT files appear in the file list and their status is AVAILABLE. But when I run the capture with Oracle Model creation, I see among the log messages "...Tables Mapped: 0...".
I created a TEST database with the table [tab] in it. The generated SS2K_SYSOBJECTS.dat file contains a row for this table:
tab     \t     2041058307     \t     U      \t     1     \t     1     \t     1610612736     \t     0     \t     0     \t     0     \t     2006/09/26      \t     0     \t     0     \t     0     \t     U      \t     1     \t     67     \t     0     \t     2006/09/26      \t     0     \t     0     \t     0     \t     0     \t     0     \t     0     \t     0     \t     \r\n
The other objects are not in the Oracle Model either (I would expect at least the user sa to have been created).
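For reference, the row above corresponds to what a direct query of sysobjects returns on the SQL Server side (only a rough sketch; the capture script's actual SELECT presumably includes more columns):
SELECT name, id, xtype, uid, crdate
FROM sysobjects
WHERE xtype = 'U'   -- 'U' = user table, the same type flag as in the row above
So the table is definitely present in the export; it just never reaches the Oracle Model.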
Can anybody please help with this problem?
Pavel Leonov, Consultant
Ispirer Systems Ltd.
SQLWays - Data, schema, procedures, triggers conversion to Oracle, DB2, SQL Server, PostgreSQL, MySQL
http://www.ispirer.com

I changed the separators back to the defaults, but the Oracle Model is still not created. The problem is the same: no tables at all show up for the source database.
Here is how the row for the table is specified in the SS2K_SYSOBJECTS.dat file:
tab     ?     2041058307     ?     U      ?     1     ?     1     ?     1610612736     ?     0     ?     0     ?     0     ?     2006/09/26      ?     0     ?     0     ?     0     ?     U      ?     1     ?     67     ?     0     ?     2006/09/26      ?     0     ?     0     ?     0     ?     0     ?     0     ?     0     ?     0     ?     ?
Here is some information from the log:
Type: Information
Time: 26-09-2006 15:13:56
Phase: Capturing
Message: Row delimiter being used for offline capture is ¤
Type: Information
Time: 26-09-2006 15:13:56
Phase: Capturing
Message: Column delimiter being used for offline capture is §
Type: Information
Time: 26-09-2006 15:13:57
Phase: Capturing
Message: Generating Offline Source Model Load Formatted File For SS2K_SYSOBJECTS.dat ,File Size: 5235
Type: Information
Time: 26-09-2006 15:13:57
Phase: Capturing
Message: Generated Offline Source Model Load File d:\DBClients\oracle\ora92\Omwb\offline_capture\SQLServer2K\itest\SS2K_SYSOBJECTS.XML
Type: Information
Time: 26-09-2006 15:14:27
Phase: Creating
Message: Mapping Tables
Type: Information
Time: 26-09-2006 15:14:27
Phase: Creating
Message: Mapped Tables
Type: Summary
Time: 26-09-2006 15:14:27
Phase: Creating
Message: Tables Mapped: 0, Tables NOT Mapped: 0
By the way, after I try to create the Oracle Model, an XML file with the following content is created for each of the DAT files:
<?xml version="1.0" encoding="Cp1251" ?><START><er></er></START>
Maybe this will help shed some light on the problem.
Pavel

Similar Messages

  • Problem with migrating data (SQL Server 2k offline data capture)

    Hi there,
    I am trying to migrate a SQL Server 2000 database to Oracle 10g (R2) using offline data capture. I am following the online tutorial “Migrate from Microsoft SQL Server to Oracle Database 10g Using Oracle Migration Workbench” and everything seems OK until “Migrating the tablespaces, users and user tables to the destination database” (http://www.oracle.com/technology/obe/10gr2_db_vmware/develop/omwb/omwb.htm#t5). On 2 of the 85 tables I get the error “Failed to create default for Table: ORA-00907: missing right parenthesis”. I checked the column/table names and they seem OK, so I don’t understand why I get these errors. However, when I check the ‘SA’ schema in Oracle Enterprise Manager Console, I can see all the tables (including the two problem ones).
    I then carried on to the next step and tried to migrate the data to the destination database (http://www.oracle.com/technology/obe/10gr2_db_vmware/develop/omwb/omwb.htm#t6). I am now stuck on step 5: “…copy the files from c:\omwb\data_files to the c:\omwb\Omwb\sqlloader_scripts\SQLServer2K\<timestamp>\Oracle directory…”. I cannot find the ‘c:\omwb\data_files’ directory; in fact, I don’t have a directory called ‘data_files’ anywhere on my machine. The ‘c:\omwb\Omwb\sqlloader_scripts\SQLServer2K\<timestamp>\Oracle’ directory does contain all 85 [tableName].ctl files (plus two other files: ‘sql_load_script’ and ‘sql_load_script.sh’). From the screenshots online, ‘c:\omwb\data_files’ seems to contain files named after the tables, so I searched for [tableName] but found nothing apart from the [tableName].ctl file in the ‘c:\omwb\Omwb\sqlloader_scripts\SQLServer2K\<timestamp>\Oracle’ directory. What should I do next?
    OS: windows 2003 with SP1
    Oracle: 10g Release 2
    SQL Server 2000
    Any help would be extremely appreciated.
    Helen

    Helen,
    > Sorry, I am new here. Could you please tell me (or point me to the related documents about) how to output the Oracle model as a script?
    Action -> Generate Migration Scripts.
    > And what do you mean by ‘The default conversion may have failed’? Do you mean data type mapping? I went through all the columns and tables and checked the data types in the Oracle model already.
    The processing of the default for a table column could contain something that the basic workbench default parser cannot handle.
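    A typical culprit is a T-SQL expression such as (getdate()) in the default that the basic parser cannot translate. Once the offending default is identified in the SQL Server table definition, the Oracle equivalent can be re-applied by hand; a minimal sketch, with hypothetical table and column names:
    -- The tables themselves were created, so only the default needs fixing;
    -- SQL Server's getdate() maps to Oracle's SYSDATE.
    ALTER TABLE sa.orders
      MODIFY (order_date DEFAULT SYSDATE);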
    I hope you are finding the workbench to be a productive aid to migration.
    Regards,
    Turloch
    Oracle Migration Workbench Team

  • Creation of a generic extractor and data source for the FAGLFLEXA table

    Hi All,
    I need to create a generic extractor and data source for the FAGLFLEXA table to support AR reporting. This table contains the profit center information necessary to perform LOB reporting against the AR data.
    Please advise on how to do this.
    Regards, Vishal

    Hi Vishal,
    It seems a fairly simple task:
    1. Go to RSO2 and choose the relevant option, i.e. whether you want to create a Transactional DS, Master Data DS, or Text DS.
    2. Name it accordingly and then create it.
    3. Give it a description and then enter the table name FAGLFLEXA.
    4. Save and activate. If you need it to be delta-enabled, click Delta and choose accordingly.
    If you still face any problems, do mail me at [email protected]
    Assign points if helpful
    Regards,
    Himanshu

  • How to find unused Transfer Rules and Data Sources for a Master Table?

    How do I find unused Transfer Rules and Data Sources for a master table? My requirement is that I need to delete those Transfer Rules and Data Sources which are not in use.

    Hi
    Go to the Manage screen of the text or attribute of the master data object, see which ones are loaded daily from there, and delete the remaining ones.
    Cheers
    Answered as expected; please reward.

  • SQL Server 2012 Change Data Capture for Oracle by Attunity support for Oracle 12

    I would like to know if there are any plans for SQL Server 2012 Change Data Capture for Oracle by Attunity to support Oracle 12 and, if so, by when.

    I have asked the author of
    http://blogs.msdn.com/b/mattm/archive/2012/03/26/cdc-for-oracle-in-sql-server-2012.aspx about this.
    I will either ask him to answer here or relay his answer myself.
    Balmukund Lakhani

  • Data import for CAF generated tables

    Hello,
    I developed a CAF application with a custom Web Dynpro UI. As the persistency so far just consists of "Local persistency", all the data is stored in CAF tables on my NW2004s system. I know that I can create records "by hand" in the Service Browser, but that is time-consuming and I have more than 10,000 records.
    How can I import data to these tables? Are there any upload functions for csv/xls or any other file types?
    Thanks and regards
    Joschi

    Hi all,
    As you all discussed importing data into SQL tables from the CAF database, I understood the concept.
    Can you please provide me with a step-by-step approach, or any document, on how to do this in practice?
    I am new to CAF development. I want to practice a scenario on CAF remote persistency; for that
    I want to use SQL tables in CAF entity services and develop some functionality (using CRUD methods).
    I know that to use SQL tables, one calls those tables from a Java application, exposes it as a web service, and then captures that web service via a CAF external service.
    Is that the correct approach?
    Can anyone explain the exact procedure (or point me to a document)?
    How can I achieve remote persistency in CAF in the case of SQL Server?
    Thanks & Regards
    sowmya.

  • Switch data sources for only 1 table?

    Greetings. In AS 2012 Tabular, I see that I have 3 existing connections. One connection is used as the source for most of the tables, but the other two connections are each the source for one table. I'd like to change those two tables to use the main data connection, but can't figure out how.
    My logical guess would be the table properties, but that's not it.
    Any ideas?
    TIA, ChrisRDBA

    Hello,
    If your three data connections point to different databases, I don't think we can combine them into one on the Tabular model design surface. If you need to use only one data connection for your Tabular model, please try to combine the underlying tables into one database.
    Please point out if I have misunderstood something.
    Regards,
    Elvis Long
    TechNet Community Support
    Thanks Elvis. As stated above, all tables reside in one DB.
    TIA, ChrisRDBA

  • Change Data Capture for XML

    We have an XML file being created every week on the mainframe. This file is loaded into an Oracle database. Initially we were performing a full refresh of a table; for business reasons, we now need to load only the changes from this XML file into our stage database.
    The changes incorporated into the stage then need to be applied to our warehouse database.
    Is this possible? We cannot implement Change Data Capture on the mainframe side.

    >
    they are not oracle DBs anyway
    >
    Then why are you posting in an Oracle forum and using Oracle terms like 'change data capture'?
    >
    3. don't want to serialize my inserts because since I am not sure I can keep up with the insert rate
    >
    That doesn't even make any sense. Oracle can generate sequence numbers faster than you can use them. Just put a trigger on the tables.
    You need to provide an example and explain how the data gets into a DB and how it gets out.
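    To make the trigger suggestion concrete, here is a minimal sketch (all object and column names are hypothetical, and it assumes a change_id column has been added to the stage table):
    CREATE SEQUENCE stage_change_seq;
    CREATE OR REPLACE TRIGGER stage_rows_stamp_change
    BEFORE INSERT OR UPDATE ON stage_rows
    FOR EACH ROW
    BEGIN
      :NEW.change_id := stage_change_seq.NEXTVAL;   -- stamp every new or changed row
    END;
    /
    -- Delta load into the warehouse: pick up only rows stamped since the last run.
    SELECT * FROM stage_rows WHERE change_id > :last_applied_change_id;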

  • Issue in data replication for one particular table

    Hi,
    We have implemented Streams in our test environment and are testing the business functionality. We have an issue with data replication for only one custom table; replication for all other tables works fine. When we update 100 rows, data replication does not happen for that particular table.
    Steps to reproduce the issue:
    Update one row -- replication successful.
    Update 100 rows -- after 3-4 hours nothing has happened.
    Please let me know if any of you have come across a similar issue.
    Thanks,
    Anand

    Extreme slowness on the apply site is usually due to locks, library cache locks, or oversized segments in the Streams technical tables left over after a failure during a heavy insert. Those tables are read with full table scans, and scanning millions of empty blocks hundreds of times causes a very big loss of performance, but not to the extent you are describing. In your case it sounds more like a form of lock.
    We need more information on this table: LOB segments? Tablespace in ASSM?
    Is the table partitioned, and do you have a job that drops partitions? Most interesting are the system waits and, above all, the apply server session waits. Given the time frame, I would rather look for a lock or a library cache lock caused by a drop partition or by concurrent updates. While you are performing the update, you may query 'DBA_DDL_LOCKS', 'DBA_KGLLOCK' and 'DBA_LOCK_INTERNAL' to check that you are not caught in a library cache lock.
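    A minimal sketch of that check (the schema name is hypothetical; DBA_KGLLOCK and DBA_LOCK_INTERNAL only exist once catblock.sql has been run):
    -- Run on the apply site while the 100-row update is in progress.
    SELECT session_id, owner, name, type, mode_held, mode_requested
    FROM dba_ddl_locks
    WHERE owner = 'APP_SCHEMA'        -- owner of the problem table (hypothetical)
    ORDER BY session_id;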

  • Data entry for Z... table

    I created a table with 3 fields and want to maintain its data. I know that I should use SE54 and then SM30 to do that, but I got this error when I tried to maintain data in SM30.
    View/table ZREGION can only be displayed and maintained with restrictions
    Message no. SV792
    Diagnosis
    You tried to call a maintenance dialog for the view/table ZREGION, for which the table maintenance is only allowed with restrictions. You can only edit the data in the environment of another program or a view cluster.
    Could anybody give me a step-by-step solution for a user table? Any replies are greatly appreciated!

    Hi,
    I think that while generating the table maintenance dialog you chose the restricted maintenance option.
    Just take another look at the table maintenance generation settings and change this property.
    That should fix the problem.
    Regards,
    Ravi
    Note : Please mark the helpful answers

  • Help needed: Data source for Facilities management (tables VTB_ASGN*)?

    Hi all,
    I am looking for a data source for facilities management in R/3 (PI 2004_1_470)
    The tables concerned are VTB_ASGN*, etc.
    Any feedback is highly appreciated and rewarded.
    Thanks and regards,
    Sally

    Hello Sally,
    Did you ever find a solution to this issue?
    My client is also looking to do reporting on facilities in BW and to integrate this data into liquidity planning in SEM-BPS. However, it looks like this will have to be a custom extractor calling one of two function modules, either FTR_FC_GET_COMMITMENT or FTR_FC_GET_COMM_DRAW_FEE.
    Gregory

  • Change Data Capture for Oracle 9i

    If your environment must keep large amounts of data current, the Oracle CDC feature is a simple solution to limiting the number of rows that Data Integrator reads on a regular basis. A source that reads only the most recent operations (INSERTs, UPDATEs, DELETEs) allows you to design smaller, faster delta loads.
    With Oracle 9i, Data Integrator manages the CDC environment by accessing Oracle's CDC packages. Oracle publishes changed data from the original table to its CDC table. After a CDC table receives published data, you can create subscriptions to access the data. Data Integrator Designer allows you to import CDC tables and create subscriptions for them.
    For more information see "Techniques for Capturing Changed Data" of the Data Integrator Designer Guide.

    Werner did a nice step-by-step instruction on how to set up CDC in Oracle and use this inside Data Integrator:
    http://www.consulting-accounting.com/time/servlet/ShowPage?COMPANYID=43&ELEMENTID=1641

  • Offline data capture

    Hi,
    I have created the .dat files using the offline capture scripts.
    From the capture wizard, I click offline capture and provide it with the files; a window appears which says it is loading the source model.
    After everything is done, I cannot see anything in the source model in the main pane of the workbench. It says no objects were captured. There were no errors in the log file. I am using the default repository.
    Any clue on how to fix this?
    manoj

    Hi Manoj,
    I think your problem may be the delimiters.
    The column and row delimiters are used to separate out the columns and rows of data in the dat file; if the same character is also present in the data itself, the file cannot be parsed correctly.
    Sybase 12 by default uses
    OFFLINE_CAPTURE_COLUMN_DELIMITER=§
    OFFLINE_CAPTURE_ROW_DELIMITER=¥
    I think your data may contain one of those characters somewhere, so when the OMWB comes to read the dat files it mistakenly thinks there is an extra column or row (see the sanity-check query near the end of this reply).
    To solve this problem you can change the column and row delimiters used.
    1) Change the delimiters in the offline capture scripts and then rerun them to generate data files with the new delimiters
    - omwb\offline_capture\Sybase12\OMWB_OFFLINE_CAPTURE.BAT
    I would change this
    rem ** SET THE VALUE FOR THE OFFLINE_CAPTURE_COLUMN_DELIMITER
    rem ** THIS VALUE SHOULD MATCH THE VALUE SET FOR THE SAME VARIABLE IN OMWB_INSTALL_DIR\bin\omwb.properties
    set OFFLINE_CAPTURE_COLUMN_DELIMITER=§
    rem ** SET THE VALUE FOR THE OFFLINE_CAPTURE_ROW_DELIMITER
    rem ** THIS VALUE SHOULD MATCH THE VALUE SET FOR THE SAME VARIABLE IN OMWB_INSTALL_DIR\bin\omwb.properties
    set OFFLINE_CAPTURE_ROW_DELIMITER=¥
    to this
    rem ** SET THE VALUE FOR THE OFFLINE_CAPTURE_COLUMN_DELIMITER
    rem ** THIS VALUE SHOULD MATCH THE VALUE SET FOR THE SAME VARIABLE IN OMWB_INSTALL_DIR\bin\omwb.properties
    set OFFLINE_CAPTURE_COLUMN_DELIMITER=º
    rem ** SET THE VALUE FOR THE OFFLINE_CAPTURE_ROW_DELIMITER
    rem ** THIS VALUE SHOULD MATCH THE VALUE SET FOR THE SAME VARIABLE IN OMWB_INSTALL_DIR\bin\omwb.properties
    set OFFLINE_CAPTURE_ROW_DELIMITER=Ñ
    These two characters rarely occur in real data, so they are the next best choice for delimiters, but you can use any character you want (it has to be a single character, though).
    2) Next you have to tell the OMWB what delimiter characters to expect
    a.
    Using a text editor, open the omwb.properties file located in the OMWB_install_dir/Omwb/bin directory. If it doesn't exist, create a new file called omwb.properties.
    b.
    Edit or add the following fields:
    OFFLINE_CAPTURE_COLUMN_DELIMITER="º"
    OFFLINE_CAPTURE_ROW_DELIMITER="Ñ"
    In the previous lines, "º" is your choice of column delimiter and "Ñ" is your choice of row delimiter.
    c.
    Save the file, then exit.
    Migration Workbench is now enabled to handle the new character encoding.
    You can try and load up your captured database now.
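    If you want to check for such collisions up front, here is a minimal sketch of the kind of query to run against the source before regenerating the dat files (shown for the sysobjects name column; CHARINDEX exists in both Sybase ASE and SQL Server, and the same check applies to any other captured column):
    SELECT name
    FROM sysobjects
    WHERE CHARINDEX('º', name) > 0   -- any hit means pick yet another delimiter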
    I hope this works,
    Dermot.

  • 00933 error when clicking on the data tab for the selected table

    Hi,
    I'm getting the following error when selecting a table from the tree view then clicking on the data tab:
    An error was encountered performing the requested operation:
    ORA-00933: SQL command not properly ended
    00933.00000 - "SQL command not properly ended"
    *Cause:
    *Action:
    Vendor code 933
    The exact same tables created from sqlplus for a different user in the same database work fine. Note that I'm not typing any SQL in myself. It would appear this happens on all tables created as this user. I cannot see any difference between the permissions for this user and others which work, but the failing user does have a "-" in the username and password, which I'm suspicious of.
    This is SQL Developer v1.2.0 (build 29.98), the most recent I believe, with Oracle Database 10g Express Edition Release 10.2.0.1.0 on Linux (Ubuntu).
    Thanks
    Martin

    "-" is not allowed in identifiers and is almost certainly the cause of your problem. This can be got round by quoting i.e. using "STUPID-NAME" instead of STUPID-NAME.
    I'm slightly surprised that SQLDeveloper doesn't quote the statement. Perhaps it quotes table names but not user names.
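    A minimal sketch of the difference (the user and table names here are hypothetical):
    -- An identifier containing "-" is only legal when double-quoted, and the
    -- quoting must then be used everywhere the name appears.
    CREATE USER "APP-USER" IDENTIFIED BY secret;
    SELECT * FROM "APP-USER"."ORDERS";    -- parses fine
    -- SELECT * FROM APP-USER.ORDERS;     -- fails to parse (the ORA-00933 seen above)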

  • PR release date capturing for all levels

    Hi,
    I have activated the PR release strategy at header level. There are about 4 release levels. In a custom PR print program, I want to capture the release dates corresponding to all the levels. Can you help me with how to go about this?
    Munna.

    Hi
    PR release date is not relevant to Release strategy.
    Purchase Requisition Release Date
    Specifies the date on which the purchase order should be initiated on the basis of the purchase requisition.
    The release date is based on:
    The purchasing department processing time defined for the plant
    The planned delivery time from the material master record or purchasing info record
    The delivery date
    The goods receipt processing time from the material master record
    Note
    The planned delivery time from the purchasing info record and the GR processing time are only taken into account if the purchase requisition was generated via materials planning.
    Example
    (The original help text illustrates this with a timeline: release date → purchasing department processing time → PO date → planned delivery time → delivery date → GR processing time → date required.)
    Date required:                     10.01.96
    GR processing time:                 2 days (working days)
    Planned delivery time:             10 days (calendar days)
    Purchasing dept processing time:    2 days (working days)
    For the material to be available on the date it is needed, the purchase requisition must be released on 09.11.96 (requirement date less GR processing time, planned delivery time, and purchasing department processing time).
    Hope it helps
    Thanks/karthik
