MS Access Memo fields to Oracle 9i

I've successfully migrated an Access 97 database to Oracle 9i, but I'm having trouble connecting to certain tables from MS Access. The common feature of these tables is that they have notes fields of type Memo.
These fields have been correctly migrated to CLOB in Oracle, but when I look at them in the Design view of Access, they show as being of type Text.
I've attempted to create another Access linked table to one of these Oracle tables, but the notes field still shows as type Text.
When these linked tables are opened, I get one of two errors:
"Reserved Error (-7711); there is no message for this error" followed by "SRS Belfast MI can't open this table in Datasheet view",
or
"ODBC - call failed" followed by "[Microsoft][ODBC Driver Manager] Function sequence error (#0)", after which the Datasheet view opens with all the fields displaying #Name? as values.
Any suggestions?

Hi Richard,
we have not been able to reproduce this problem in-house. Can you please forward the following information to [email protected]:
1. Information on how to produce a small testcase that causes the same problem.
2. The error.log file which can be found in %OMWB_HOME%\bin.
3. Details on the version of the ODBC driver you are using.
Thank you,
Tom.

Similar Messages

  • Access Memo Fields to Oracle 8.0

    I am converting an MS Access application to Oracle 8.0. The MS
    Access application has tables with multiple "Memo" fields, some
    over 6000 characters long. Oracle will only allow one "Long"
    field per table. How can I get around this? What is the VARCHAR2
    maximum length now in 8.0?

    Michael,
    Apologies for taking so long to get back to you.
    The situation is as follows:
    The maximum length of VARCHAR2 in Oracle 8.x is 4000 characters.
    The solution would be to use LOBs. Unlike LONG columns, from
    Oracle 8.x onwards you can have any number of LOB columns in
    a table (a short sketch follows below). The size of each LOB
    can be up to 4 GB, if my memory serves right. LOBs can be either
    internal or external: internal LOBs are stored inside the database
    under its transactional control, while external LOBs (BFILEs) are
    stored as files on the file system and are only read through the
    database.
    Advantages of LOBs are that you can replicate them and index them
    using the ConText (Oracle Text) option.
    Regards,
    Marie
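    For illustration, a minimal sketch of the multiple-LOB-columns point (table and column names are invented):
    CREATE TABLE app_notes (
      id          NUMBER PRIMARY KEY,
      short_note  VARCHAR2(4000),   -- fits within the 8.x VARCHAR2 limit
      long_note1  CLOB,             -- several LOB columns are allowed in one table,
      long_note2  CLOB              -- unlike LONG, which is limited to one per table
    );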

  • Access BLOB field in Oracle database

    Dear Sir,
    I am using Pro*C to access Oracle BLOB fields.
    I write "12345" to an Oracle BLOB field, and when I read it back it is fine.
    I then modify this record to "ABC"; when I read it, I get "ABC45".
    How can I reset the BLOB field to its initial state so that I can modify it correctly?
    Many Thanks
    Liang

    Initially you write "12345"; when you modify the record you are overwriting only 3 bytes starting at offset 1, hence the result. Oracle will not erase the rest of the LOB when you write to it, so you need to erase (or trim) it before you modify (write).
    The following document gives an insight into the PL/SQL LOB functions:
    http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_lob.htm#sthref4428
    The following is the Pro*C LOBs guide:
    http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28427/pc_16lob.htm#i998068
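    A rough PL/SQL illustration of the same idea, using invented table and column names (a Pro*C program would do the equivalent with its embedded LOB statements):
    DECLARE
      l_blob BLOB;
    BEGIN
      -- lock the row and fetch the LOB locator
      SELECT blob_col INTO l_blob FROM t WHERE id = 1 FOR UPDATE;
      -- shrink the LOB to length 0 so no old bytes ('45') survive the rewrite
      DBMS_LOB.TRIM(l_blob, 0);
      -- write the new contents starting at offset 1
      DBMS_LOB.WRITE(l_blob, 3, 1, UTL_RAW.CAST_TO_RAW('ABC'));
      COMMIT;
    END;
    /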

  • Conversion problems with Memo field in Access database

    I have the latest version of Migration Workbench and the Access 2000 plug-in.
    I am using an ODBC driver, and the SQL conversion requirements for converting
    to a native driver seem too time- and labor-intensive to be feasible.
    My problem is that I have some "Memo" columns in my Access database containing
    over 200,000 rows that I need converted to Oracle. This is something I'm going
    to need to do over-and-over again for different Access databases.
    I gave up on mapping the Access "Memo" datatypes to Oracle "Clob" datatypes; it
    was simply taking far too long.
    Then I tried mapping the Access "Memo" datatypes to "Varchar2(4000)" and although
    it completed it took longer than my customer will like (think 1/2 an hour just for
    all the rows of one memo column in one table).
    Finally I tried going into the original Access database and converting the "Memo"
    column into a "Text" column. It wouldn't work, I kept getting the error:
    =====
    Microsoft Access can't change the data type.
    There isn't enough disk space or memory.
    =====
    even though I had over 10GB of free disk space and only 1/3 of my memory was being
    used. I assume this is a memory-max limitation in Access 2000.
    Any suggestions for speeding up the conversion? Could I stagger the process: use Migration
    Workbench to create the tables, then use SQL*Loader to load the data, then use Migration
    Workbench again to put in the indexes, etc. (a rough control-file sketch follows below)? Would this be faster even if it worked?
    Thanks,
    Aaron Chawla
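    If that staggered route is tried, a SQL*Loader control file along these lines might handle the Memo-to-CLOB load; this is only a sketch, and the table name, file names and column layout are assumptions:
    LOAD DATA
    INFILE 'memo_rows.csv'
    INTO TABLE big_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      id,
      note_file  FILLER CHAR(100),                      -- per-row file name exported from Access
      notes      LOBFILE(note_file) TERMINATED BY EOF   -- each memo is read from its own file into the CLOB column
    )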


  • URGENT! Can I use a THIN JDBC driver to access a CLOB field from an Oracle 8.0.5 DB?

    URGENT! Can I use a THIN JDBC driver to access a CLOB field from an Oracle 8.0.5 DB?

    I think you'd need to contact Oracle support to get access to older versions of the driver.
    Since 8.0.5 isn't supported any longer, however, is it possible for you to update your Oracle client to one of the supported releases (8.1.7 or 9i)?
    Justin

  • What is the equivalent Oracle datatype for the access Memo type?

    Hi,
    I would like to know what the Oracle equivalent of the Access Memo datatype is.
    Thanks
    Adam

    Yes: use CLOB. Look at this example:
    SQL>create table x(y clob);
    Table created.
    SQL>insert into x values (
      2  'This is a very long string
      3  and that was an carriage return - entered in notepad'
      4  )
      5  /
    1 row created.
    SQL>select * from x;
    Y
    This is a very long string
    and that was an carriage return - entered in notepad
    SQL>

  • Anyone else having issues writing string "memo" fields to Access 2007-2010 or SQL Server Express 2008-2012?

    I have a data cluster that I am working to write to either Access 2007 through 2010 and/or SQL Server Express 2008 through 2012, all on a Windows 7 64-bit OS. Some of the elements are long strings holding delimited raw trace data; the fields can be tens of KB long. With Access I cannot put any string longer than 255 characters into the memo field. With SQL Server Express 2008 R2 - 2012 I found that strings longer than 32767 characters again fail to fill the memo fields. The Access writes and reads work fine otherwise, just not the listed offending fields in the cases mentioned. Is anyone else having this issue, or does anyone have a solution?

    If the long string field has repetitive data, you might be able to compress it using GZIP to reduce its size below the 32K limitation you mentioned.
    GZIP compress/uncompress of string using .NET

  • SQL Loader problem in loading MS Access Memo Data

    Hi guys!
    I am having trouble loading my Access data into my Oracle database.
    I first export the data from Access to an Excel file and then save the Excel file as a CSV file.
    Then, using SQL*Loader, I load the CSV file into Oracle.
    load data
    REPLACE
    into table TBL1
    fields terminated by ','
    TRAILING NULLCOLS
    (
      NAME,
      ID,
      NOTES CHAR(4000)
    )
    The problem is with the NOTES data. It is declared as Memo type in Access and I declared it as VARCHAR2(1000) in Oracle. If I don't have any carriage returns (Enter key) in the data, the data loads successfully, but when it has carriage returns (Enter key) the data is cut (continued on the next line) and wrong data is loaded...
    But if I just copy and paste the data manually, it loads successfully.
    How do I deal with this problem? I have over 20,000 rows, so I can't copy and paste them all.
    Appreciate any help..
    Thanks!

    Did you try specifying the end-of-record string (the "str '<terminator string>'" clause on the INFILE line), or the whitespace-trimming options? Please refer to the Oracle Database Utilities guide for more information.
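    As a sketch of that suggestion, assuming the CSV is re-exported with a distinctive end-of-record marker such as '|' followed by a newline (the marker and file name are assumptions), the control file can declare the marker so that carriage returns inside NOTES are kept as data:
    LOAD DATA
    INFILE 'tbl1.csv' "str '|\n'"        -- records end at |<newline>, not at every carriage return
    REPLACE
    INTO TABLE TBL1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      NAME,
      ID,
      NOTES CHAR(4000)                   -- embedded newlines inside NOTES are loaded as part of the data
    )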

  • DTS Import of MDB in SQL Server 2000 Drops Memo Field Data

    I have used DTS in SQL Server 2000 to import an MDB file (MS Access) containing a table.  When the table is imported the primary key is lost and the memo field data is completely gone.
    I use the transformation option in the DTS wizard to add the primary key and make sure the data type for the memo field is varchar with a size of 8000.  I need that large size since I am storing lots of HTML code.
    When I preview the data I see the html code that is supposed to get imported.  However, when I return all rows from the table in Enterprise Manager the field is empty.
    So I tried to manually copy the data from the MS Access Database into SQL Server.  Could not figure out if SQL Server has an interface like MS Access to simply copy data into a table.  So I linked to the tables from MS Access to the SQL Server table. 
    When I opened the linked table I see the data in the description field.  However, if I return the rows from within SQL Server no data is present.
    I have some ASP code trying to read the data in the SQL Server table.  However, nothing is returned and when I run the SQL Statement, nothing gets returned.  The SQL statement returns all rows.  All the other data is present but nothing in the description field. 
    What am I doing wrong?  Any suggestions anyone, please!
    TIA 

    It is important to know the version of MS Access. I would recommend using the nvarchar datatype instead of varchar, since the description field may contain Unicode characters.
    Refer to this link to understand more about datatype mapping between an Access source and a SQL Server destination.
    http://blogs.msdn.com/b/ssma/archive/2011/03/06/access-to-sql-server-migration-understanding-data-type-conversions.aspx?Redirected=true
    Regards, RSingh

  • How To Increase the Size of Memo Fields Beyond 4000 Char in WebCenter 11g?

    How To Increase the Size of Memo Fields greater than 4000 Char in WebCenter Content (UCM) 11g?
    I was able to increase the size of the Memo field from 2000 to 4000 characters by setting the parameter MEMOFIELDSIZE=4000. But the requirement is to increase the size to around 7000 characters, and the database (Oracle 11g) does not support more than 4000 characters.
    Is there any other way to increase the size of Memo field?
    Thanks in Advance!!
    Regards
    Ram

    Hello All,
    Thanks for your responses.
    Let me explain the scenario again in detail:
    1. We have three metadata fields: xProductCategory, xProductFamily and xProducts.
    2. These fields are related through the DCL feature, and the corresponding values are stored in corresponding custom views.
    3. xProductFamily is dependent on xProductCategory, and xProducts is dependent on xProductFamily.
    4. Currently there are around 250 product values, and an average product name is about 25 characters, so if all product values are selected for some content items the values are stored in the DocMeta table as comma-separated values (with one space), and the total comes to far more than 4000 characters.
    5. Currently the view that stores the product values stores the "Name" column instead of an ID (like 1, 2, 3, ...).
    6. Our Content Server is integrated with Oracle WebCenter Portal, where the logic retrieves content based on "Product Name" instead of "Product ID", and we are already in the UAT phase, so we are not in a position to switch from Product Name back to Product ID.
    7. Also, even if we stored the values by Product ID, the issue could arise again in future if the number of products increases.
    I hope the explanation of the issue is clearer now; I am still looking for a possible way to increase the length of the memo field.
    Thanks in Advance !!
    Regards
    Ram

  • Problem displaying memo field

    I have a table in MS Access with a memo field. Everything the
    user inputs in the field is recorded in the table. When I try to
    output the memo field I am only getting part of the field. Attached
    is the code for displaying the data from the acty_comments field.
    Also below is the actual data that is in the field. Can anyone tell
    me what is going on?
    Thanks
    Actual Data:
    "The answers to the questions are below in italic magenta.
    The answers reiterate FAR sections. As they are parts of
    published regulations, there is no sense in referring to them
    ""official GSA policy"" as if there is some other authority for the
    informa
    - here is where it stops-
    - here is what should be next
    tion.
    Ask Acquisition is a convenience service of the Office of
    Acquisition Policy; not as an avenue for policy decisions. If they
    need a formal response from the Office of Acquisition Policy, you
    will need to send any formal inquiries to
    - more data -

    Are you using GROUP BY, or aggregating in some way, or using
    DISTINCT?
    There is a limitation in Access if so: aggregating or grouping on a
    memo field truncates it to 255 characters.
    I know there is a Microsoft technote somewhere, but I can't find it
    right now.
    Here's a link that might help:
    http://allenbrowne.com/ser-63.html
    Tim Carley
    www.recfusion.com
    [email protected]
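    Assuming aggregation is indeed the cause, a minimal Access SQL sketch of the usual workaround (the table name tblActivity and key column acty_id are invented; acty_comments is the field named above) is to wrap the memo column in First() instead of grouping on it, so it is not truncated to 255 characters:
    SELECT acty_id, First(acty_comments) AS full_comments
    FROM tblActivity
    GROUP BY acty_id;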

  • Problem accessing TABLE fields in SELECT statement

    Hi,
    We are currently using Oracle Database 10.2.0.2.0.
    In the following code, using a function to access TABLE fields works, but not when accessing the table fields directly (in the latter case, I get a no data found exception).
    Why is that?
    Thanks for your help.
    Olivier
    PS: I do have a lengthy explanation of why we would want to do that as well as the full packages, etc... But I didn't want to bore you to no end.
    I'll post it if required.
    CREATE OR REPLACE PACKAGE PA_TEST_DEVTBL AS
       TYPE TBL_ROLCODE IS TABLE OF LANROLE.ROLCODE%TYPE INDEX BY BINARY_INTEGER;
       TYPE TBL_ROLLABEL IS TABLE OF LANROLE.ROLLABEL%TYPE INDEX BY BINARY_INTEGER;
    end PA_TEST_DEVTBL;
    CREATE OR REPLACE PACKAGE BODY PA_TEST AS
       -- Array containing the selected data
       TblRolCode PA_TEST_DEVTBL.TBL_ROLCODE;
       TblRolLabel PA_TEST_DEVTBL.TBL_ROLLABEL;
       -- Functions created to retrieve each array data
       FUNCTION F_GET_ROLCODE( nIndex NUMBER ) RETURN LANROLE.ROLCODE%TYPE IS
       BEGIN
          RETURN TblRolCode( nIndex );
       END F_GET_ROLCODE;
       FUNCTION F_GET_ROLLABEL( nIndex NUMBER ) RETURN LANROLE.ROLLABEL%TYPE IS
       BEGIN
          RETURN TblRolLABEL( nIndex );
       END F_GET_ROLLABEL;
       PROCEDURE S_TEST (
    -- THIS DOESN'T WORK (ORA-01403: no data found)
    OPEN cReturn FOR
    SELECT TblRolCode( ROWNUM ),
    TblRolLabel( ROWNUM )
    FROM TABLE( CAST( tblRows AS T_TBL_NUMBER ) );
    -- BUT THIS WORKS !!!
    OPEN cReturn FOR
    SELECT F_GET_ROLCODE( ROWNUM ) AS ROLCODE,
    F_GET_ROLLABEL( ROWNUM ) AS ROLLABEL
    FROM TABLE( CAST( tblRows AS T_TBL_NUMBER ) );
    ..

    Well, it could be managed by simple HTML tags or simple JavaScript properties and events...
    All you have to do is encode the URL and pass it as request parameters to the next page.
    Check out a simple example down below.
    MainTable.jsp:
    ============
    <table>
    <thead>
    <tr>
    </tr>
    </thead>
    <tbody>
    <c:forEach var="DTOBean" items="${request.dbList}">
      <tr>
         <td><c:out value="${DTOBean.rowId}"/></td>
         <td onClick="window.location.href='/testpage.jsp?col2='+escape('<c:out value="${DTOBean.col2}"/>')+'&col3='+escape('<c:out value="${DTOBean.col3}"/>'); " ><c:out value="${DTOBean.col1"/></td>
        <!-- or try with to work with simple hyperlink
          <td><a href="# " onclick="window.location.href='/testpage.jsp?col2='+escape('<c:out value="${DTOBean.col2}"/>')+'&col3='+escape('<c:out value="${DTOBean.col3}"/>');"><c:out value="${DTOBean.col1"/></a></td>
        -->
          <td><c:out value="${DTOBean.col2"/></td>
          <td><c:out value="${DTOBean.col3"/></td>
      </tr>
    </c:forEach>
    </tbody>
    </table>
    --------------------------------------------------
    testpage.jsp:
    ==========
    Column2 : <c:out value="${param.col2}"/>
    Column3 : <c:out value="${param.col3}"/>
    --------------------------------------------------
    Hope that might help.
    REGARDS,
    RaHuL

  • Problems displaying rtf memo fields

    Post Author: Davidm
    CA Forum: General
    We use Crystal Reports X to run reports on an Access database with a significant number of rtf memo fields. We use Total Access Memo to allow extended use of rtf within Access as our users require the additional formatting capabilities that this offers.
    However, there are considerable problems displaying items like bullet points, tables and hyperlinks in the Crystal Viewer with tables in particular coming out in a real mess with all entries in the table being displayed in a list with a small square marking each cell at the end of each line.
    Curiously, if you preview the report in the full version of Crystal X, it appears a bit better: the hyperlinks are still underlined (bullet points still vanish) and rows in rtf tables are at least presented as rows, even if the columns are out of order and mixed together (with no bounding cells visible).
    Firstly, is this disparity of end result between full Crystal X Print Preview and the Crystal X Viewer fixable?
    Secondly, is the limited handling of rtf tags likely to be solved in the next release of Crystal?

    Hi Mathias,
    If I caught you correctly, you want to display data in Adobe forms in the form of a table, right?
    So, follow the steps:
    1. Insert one sub form on your adobe form.
    2. Set its type as "flow content" in object->subform property.
    3. Set flow direction as "Table".
    4. Insert another subform inside this subform.
    5. set its type as "flow content" and flow direction as "Table row".
    6. Now, choose binding tab, and there check "repeat subform for each Data item check box" and specify min. count for your rows.
    7. Now, insert your column fields inside this sub form once.
    8. Format its look and feel as you want.
    When you run this application, it will show you multiple data as table on Adobe form.
    Regards,
    Bhavik

  • Blank space update to Collection field in Oracle 10g

    Hi,
    we have the Oracle 9i database in the dev environment and Oracle 10g in Production environment.
    In Oracle 9i the scenario below works fine, but in Oracle 10g it won't. The scenario is:
    Update table1
    set test_collection = ' '; -- works fine in Oracle 9i, but it won't work in Oracle 10g
    Update table1
    set test_collection = null; -- works in both versions.
    Please suggest.
    raja k

    What is myv? Is that the collection name? See post number 5 of this thread where I declared it.
    > In Oracle 9i, we need not specify the collection
    > name while updating the collection type field in the
    > table.
    9.2.0.6 was notoriously buggy and wasn't considered stable till 9.2.0.7, if I recall correctly. And now (thankfully) it's unsupported.
    > The SQL code was written for Oracle 9i; now,
    > while migrating to Oracle 10g, we are getting this
    > problem. The problem is, I don't have access
    > to the Production environment, which has been migrated
    > to Oracle 10g; we develop the code on Oracle 9i and
    > release it to deploy on Oracle 10g.
    Develop in 9i and deploy to 10g! Do you really think that's a good setup? How about upgrading your development environment to match production.
    > update table1 set test_collection = null; -- works fine in Oracle 9i
    > update table1 set test_collection = ''; -- works fine in Oracle 9i
    > but in Oracle 10g:
    > update table1 set test_collection = null; -- works fine in Oracle 10g
    > update table1 set test_collection = ''; -- does not work in Oracle 10g
    Yes, I already understand what your issue is.
    > As you suggested, do we need to specify the
    > collection name while updating the collection type
    > field in Oracle 10g? In Oracle 9i it wasn't
    > required?
    Yes, that would be the correct way to do it.
    > Karthick,
    > because null is not the same as ''
    > SQL> select * from dual where '' = null
    >   2  /
    > no rows selected
    Errrm, yes it is actually...
    SQL> select 1 from dual where '' is null;
             1
    ----------
             1
    SQL>
    ... if you use the correct condition for checking nulls.
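    A minimal sketch of the suggested fix, assuming test_collection is a column of a SQL collection type named test_coll_type (the type name is invented): assign an empty collection constructor or NULL explicitly instead of relying on '' being converted.
    -- empties the collection explicitly; behaves the same way in 9i and 10g
    UPDATE table1 SET test_collection = test_coll_type();
    -- or clear the column entirely
    UPDATE table1 SET test_collection = NULL;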

  • SQL conversion of MEMO field

    I have a web site that builds pages based on content in an
    Access DB. One of the fields is MEMO and contains body copy for the
    page (usually 300-400 words per record). This bodyCopy field is
    HTML tagged IN the record (in case that matters). So I import the
    whole database into SQL. Fine. The pages display ok - All the data
    in the bodyCopy field is STILL in there, I can see it on the
    web pages. BUT, I cannot see or edit the cells in SQL Enterprise
    Manager. If I add a new record, similarly sized bodyCopy pastes are
    truncated. SQL converted the MEMO field to ntext(16). I have tried
    text(16) and nvarchar(8000) with the same results. I need to be
    able to edit the existing cells, and input similarly sized data
    chunks in with new records...
    Can anyone shed some light on this issue?

    SQL Server is a bit different from Access in how you manage the
    data directly.
    ntext should be fine for any previous memo fields.
    If it tells you that you cannot edit the cells, it is usually
    because the key has not been defined. Importing from Access will
    bring in all your data, but it won't necessarily apply the key
    identifier. Without a key it won't let the data be edited, at least
    not directly.
    If it is showing up in the web page OK, then the data would
    not appear to be truncated, just not all viewable in the SQL
    listing. If you try to enter too much data into a field in SQL Server
    it usually will simply refuse to do it.
    Have you tried connecting to the SQL Server via Access (ADP)?
    If you have been used to Access all this time, it can be a really
    nice way of working between the two and gives you a familiar way of
    doing things.
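    A hedged sketch of the key suggestion (object and column names are invented): once the imported table has a primary key, Enterprise Manager will let the rows, including the ntext column, be edited.
    -- run against the imported table, e.g. in Query Analyzer
    ALTER TABLE dbo.Pages
      ADD CONSTRAINT PK_Pages PRIMARY KEY (PageID);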
