Error while accessing BSAD Table with dunning date
Hi ,
I developed a report for the FI module accessing the BSAD table with default customer ranges and specific dunning dates. It ran for a very long time and timed out (I know this is due to the huge volume of data).
Is there any way to access the BSAD table efficiently by dunning date (other than creating an index on it)?
Or is there any standard function module available?
Regards
Rajesh.
Hi
Try the below tables for the dunning data details:
MHND Dunning Data
MHNDO Dunning data version before the next change
MHNK Dunning data (account entries)
MHNKA Version administration of dunning changes
MHNKO Dunning data (acct entries) version before the next change
SKS
Similar Messages
-
Error while accessing External table.
Hi All,
I am getting an error while accessing an Oracle external table. I created the table with the following query:
CREATE OR REPLACE DIRECTORY load_dir AS '\\oraaps\Exceldata\';
CREATE TABLE my_sheet (
  DEPTNO NUMBER,
  DNAME VARCHAR2(14),
  LOC VARCHAR2(13)
)
ORGANIZATION EXTERNAL (
  TYPE oracle_loader
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    BADFILE load_dir:'my_sheet.bad'
    LOGFILE load_dir:'my_sheet.log'
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (DEPTNO, DNAME, LOC)
  )
  LOCATION ('my_sheet.csv')
) REJECT LIMIT UNLIMITED;
I am sure that the table and the directory got created, because I can see the table in SQL Developer. But whenever I run select * from my_sheet, I get the following error in the log file:
LOG file opened at 10/16/06 14:48:21
Field Definitions for table mysheet
Record format DELIMITED BY NEWLINE
Data in file has same endianness as the platform
Rows with all null fields are accepted
Fields in Data Source:
DEPTNO NUMBER Terminated by ","
Trim whitespace same as SQL Loader
DNAME VARCHAR2(14),
Terminated by ","
Trim whitespace same as SQL Loader
LOC VARCHAR2(13)
Terminated by ","
Trim whitespace same as SQL Loader
KUP-04001: error opening file \\oraaps\Exceldata\mysheet.csv
KUP-04017: OS message: The data is invalid.
Please do reply. It's urgent from my project deliverable point of view.
Any help appreciated.
Thanks and Regards.
V.Venkateswara Rao
It is not an Oracle error/problem. The error message is quite specific about the actual root cause of the problem:
KUP-04001: error opening file \\oraaps\Exceldata\mysheet.csv
KUP-04017: OS message: The data is invalid.
These are operating system errors. The operating system cannot access/open/read the specific UNC and/or file.
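One hedged sketch of such a fix (the local path below is hypothetical): copy the file onto the database server itself and re-point the directory object at a folder the Oracle OS user can read.

```sql
-- Hypothetical local folder on the database server;
-- the OS user running the Oracle processes must be able to read it.
CREATE OR REPLACE DIRECTORY load_dir AS 'C:\Exceldata\';

-- The external table definition is unchanged; only the directory target moves.
SELECT * FROM my_sheet;
```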
Fix it at o/s level and it will work when Oracle needs to make that o/s call. -
OWB 10g R1 : Error while accessing External Table
Dear All,
We have created a few external tables and registered them using the OS user oracle.
Now we have changed to the user 'oratester' and re-registered the location.
oratester has all the rights on the folder which the external table is referring to.
Still, we are facing the following error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout ORA-29400: data cartridge error KUP-04063: unable to open log file log_test.txt OS error Permission denied
If anybody is having any idea how to solve this error, please reply
Thanks in Advance
malleyah.
I've configured the access parameters.
The code generated , has the following
ACCESS PARAMETERS (
RECORDS DELIMITED BY NEWLINE
CHARACTERSET WE8MSWIN1252
STRING SIZES ARE IN BYTES
BADFILE ALL_DB_LOC_CMN_FLS_LOG_LOC:'bad_central.txt'
DISCARDFILE ALL_DB_LOC_CMN_FLS_LOG_LOC:'discard_central.txt'
LOGFILE ALL_DB_LOC_CMN_FLS_LOG_LOC:'log_central.txt'
FIELDS
TERMINATED BY '~'
OPTIONALLY ENCLOSED BY '"' AND '"'
Which OS user does Oracle use to create the log/bad files? -
Error while importing a table with BLOB column
Hi,
I have a table with a BLOB column. When I export the table it gets exported correctly, but when I import it into a different schema with a different tablespace, it throws an error:
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "CMM_PARTY_DOC" ("PDOC_DOC_ID" VARCHAR2(10), "PDOC_PTY_ID" VAR"
"CHAR2(10), "PDOC_DOCDTL_ID" VARCHAR2(10), "PDOC_DOC_DESC" VARCHAR2(100), "P"
"DOC_DOC_DTL_DESC" VARCHAR2(100), "PDOC_RCVD_YN" VARCHAR2(1), "PDOC_UPLOAD_D"
"ATA" BLOB, "PDOC_UPD_USER" VARCHAR2(10), "PDOC_UPD_DATE" DATE, "PDOC_CRE_US"
"ER" VARCHAR2(10) NOT NULL ENABLE, "PDOC_CRE_DATE" DATE NOT NULL ENABLE) PC"
"TFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 65536 FREELISTS"
" 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "TS_AGIMSAPPOLOLIVE030"
"4" LOGGING NOCOMPRESS LOB ("PDOC_UPLOAD_DATA") STORE AS (TABLESPACE "TS_AG"
"IMSAPPOLOLIVE0304" ENABLE STORAGE IN ROW CHUNK 8192 PCTVERSION 10 NOCACHE L"
"OGGING STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEF"
"AULT))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TS_AGIMSAPPOLOLIVE0304' does not exist
I used the import command as follows :
imp <user/pwd@conn> file=<dmpfile.dmp> fromuser=<fromuser> touser=<touser> log=<logfile.log>
What can I do so that this table gets imported correctly?
Also tell me "whether the BLOB is stored in different tablespace than the default tablespace of the user?"
Thanks in advance.
Hello,
You can either:
1) Create a tablespace with the same name in the destination database where you are importing.
2) Get the DDL of the table, change the tablespace name to an existing tablespace in the destination, run the DDL in the destination database, and then run your import command with the option ignore=y, which will ignore all the create errors.
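A minimal sketch of option 1 (the datafile path and sizes below are placeholders; adjust them for your environment):

```sql
-- Create the tablespace the dump file expects, then re-run the import as-is.
CREATE TABLESPACE ts_agimsappololive0304
  DATAFILE '/u01/oradata/ORCL/ts_agimsappololive0304_01.dbf' -- hypothetical path
  SIZE 100M AUTOEXTEND ON NEXT 10M;
```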
Regards,
Vinay -
Error while accessing a Table on Oracle Database 10.2
Hi Experts,
We have a table that contains a CLOB datatype in one of its columns. However, when I try to access the table I get the error below.
<b>Table Name:</b> discrete_jobs
<b>Error:</b>
(Error starting at line 1 in command:
select * from [email protected]
Error report:
SQL Error: ORA-22992: cannot use LOB locators selected from remote tables
22992. 00000 - "cannot use LOB locators selected from remote tables"
*Cause: A remote LOB column cannot be referenced.
*Action: Remove references to LOBs in remote tables.)
Please help!
Regards,
Ravi R
See some workarounds: "How to select table from remote database having clob field"
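One such workaround, sketched below (the database link name is a placeholder, since the link in the original query is obfuscated): CREATE TABLE ... AS SELECT is allowed to copy LOB data across a database link, even though selecting a LOB locator from a remote table directly is not.

```sql
-- Materialize the remote rows (including the CLOB column) in a local table.
CREATE TABLE discrete_jobs_local AS
SELECT * FROM discrete_jobs@apps_link;  -- apps_link is a placeholder link name

-- The LOB column can now be queried locally without ORA-22992.
SELECT * FROM discrete_jobs_local;
```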
-
Error while accessing a war with xmlbeans under WEB-INF/lib
Hi All,
I am trying to deploy a WAR file with an XMLBeans-generated jar under the WEB-INF/lib folder, along with other jars.
I am getting the following error:
java.lang.NoClassDefFoundError: com/xx/DataServiceRequestDocument
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2291)
at java.lang.Class.getDeclaredField(Class.java:1880)
at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1610)
at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:52)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:425)
at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:413)
at java.io.ObjectStreamClass.lookup0(ObjectStreamClass.java:310)
at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:547)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1582)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1731)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
at com.tangosol.util.ExternalizableHelper.readSerializable(ExternalizableHelper.java:2216)
at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2347)
at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2290)
at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
at com.tangosol.coherence.component.net.extend.Channel.deserialize(Channel.CDB:15)
at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3306)
at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2603)
at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.readExternal(InvocationServiceFactory.CDB:5)
at com.tangosol.coherence.component.net.extend.Codec.decode(Codec.CDB:29)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.decodeMessage(Peer.CDB:25)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:54)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
at java.lang.Thread.run(Thread.java:662)
I tried multiple combinations, like changing the deployment descriptor below, but no success.
<wls:weblogic-web-app xmlns:wls="http://xmlns.oracle.com/weblogic/weblogic-web-app" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd http://xmlns.oracle.com/weblogic/weblogic-web-app http://xmlns.oracle.com/weblogic/weblogic-web-app/1.0/weblogic-web-app.xsd">
<wls:weblogic-version>10.3.2</wls:weblogic-version>
<wls:context-root>CoherenceWS</wls:context-root>
<wls:container-descriptor>
<wls:prefer-web-inf-classes>true</wls:prefer-web-inf-classes>
</wls:container-descriptor>
</wls:weblogic-web-app>
I have very few classes in WEB-INF/classes (I am trying to load the Java objects from lib).
My WEB-INF/lib folder contains following jar
coherence.jar
commons-beanutils-1.8.0.jar
commons-collections-3.2.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
dsRemoteCall.jar
ezmorph-1.0.jar
jackson-all-1.8.5.jar
json-lib-2.1-jdk13.jar
org.springframework.asm-3.1.0.M2.jar
org.springframework.beans-3.1.0.M2.jar
org.springframework.context-3.1.0.M2.jar
org.springframework.core-3.1.0.M2.jar
org.springframework.expression-3.1.0.M2.jar
org.springframework.oxm-3.1.0.M2.jar
org.springframework.web-3.1.0.M2.jar
org.springframework.web.servlet-3.1.0.M2.jar
schemaclasses.jar
xbean.jar
xom-1.2.7.jar
Am I missing something? Any help is greatly appreciated
Thanks
sunder
Edited by: 868704 on Sep 6, 2011 6:10 PM
Note: schemaclasses.jar contains the "com/xx/DataServiceRequestDocument" class file.
WebLogic Server 10.3.5.0
Thanks
sunder
Edited by: 868704 on Sep 6, 2011 6:26 PM -
Error while calling XQuery Function with xs:date type as Argument
Hi,
I have the following functions in my DataService:
declare function tns:getXXXDetail($effectiveDate as xs:date,
$cancelDate as xs:date) as element(ns26:XXXAccount)* {
implCode
declare function tns:testGetXXXDetail($searchCriteria as element(ns15:locateMemberXXXDetail))
as element(ns26:XXXAccount)* {
for $Account in tns:getXXXDetail($searchCriteria/ns16:accountTypeDates/ns18:effectiveDate,
$searchCriteria/ns16:accountTypeDates/ns18:cancelDate)
return $Account
I am trying to test the getXXXDetail() function from testGetXXXDetail.
The searchCriteria is a complex type with the date elements effectiveDate and cancelDate; both are optional.
When I test with the effectiveDate and cancelDate elements present in $searchCriteria, it works fine.
When I remove these date elements, I get the following error:
"expected exactly one item, got 0 items"
Any clue?
> When I remove these date elements I get the error "expected exactly one item, got 0 items"
Sounds like your schema for these items indicates minOccurs="1" (or relies on that as the default).
Edit the schema and change the definitions to be...
<xs:element name="effectiveDate" minOccurs="0" ... /> -
Error while executing planning function with reference data
Hi,
I have two planning functions: one is used to upload the file (the "reference data" checkbox is not selected in planning function type RSPLF1), and the other (the "reference data" checkbox is selected in the custom planning function type RSPLF1) executes the logic of creating a new record along with the flat file data.
The following data is uploaded:
Company code | Profit_ctr | calmonth | Amount
1000 | 50000 | 01.2011 | 150
Cube data
Field1 | Company code | Profit_ctr | calmonth | Amount
| 1000 | 50000 | 01.2011 | 150
Z1 | 1000 | 50000 | 01.2011 | 150
Now I want to change the value from 150 to 200, and when I try to execute with the following data, it gives the dump 'a row with the same key already exists'.
Company code | Profit_ctr | calmonth | Amount
1000 | 50000 | 01.2011 | 200
Ideally, in the second execution it should append a new row with Amount value 50 to the cube, which is the delta value.
I debugged the issue and found that I_TH_REF_DATA has following data and C_TH_DATA also contains the same records.
Field1 Company code | Profit_ctr | calmonth | Amount
# 1000 | 50000 | 01.2011 | 150
Z1 1000 | 50000 | 01.2011 | -150
Z1 1000 | 50000 | 01.2011 | 150
Due to this, a record with the same combination already exists in C_TH_DATA, and trying to append a new record with that key fails.
C_TH_DATA should only contain the source data with Amount 200, but I am not sure why the reference data is coming into C_TH_DATA.
Could anyone please guide me on how the reference data is getting populated in C_TH_DATA ?
Thanks in advance
Edited by: peppy on Aug 3, 2011 5:00 PM
Edited by: peppy on Aug 3, 2011 8:37 PM
Hi Peppy,
C_TH_DATA is a hashed table! According to your post you are trying to append to C_TH_DATA, and this results in a dump. Please take a look at a standard planning function to see how SAP programs planning functions. E.g. in CL_RSPLFC_REPOST, method IF_RSPLFA_SRVTYPE_IMP_EXEC~EXECUTE, you can find the following code:
CREATE DATA l_r_data_wa LIKE LINE OF c_th_data.
ASSIGN l_r_data_wa->* TO <s_data_wa>.
CREATE DATA l_r_new_wa LIKE LINE OF c_th_data.
ASSIGN l_r_new_wa->* TO <s_new_wa>.
LOOP AT c_th_data INTO <s_data_wa>.
  <s_new_wa> = <s_data_wa>.
* now the SAP code changes the values; you can do it your way here
* and then write the changes back
  MODIFY TABLE c_th_data FROM <s_data_wa>.
ENDLOOP.
Another option is to use the READ statement to check whether the record is already in the table. If it is not, you use INSERT; otherwise you use MODIFY. So you get something like this:
READ TABLE c_th_data FROM <s_data_wa> TRANSPORTING NO FIELDS.
IF sy-subrc NE 0.
  INSERT <s_data_wa> INTO TABLE c_th_data.
ELSE.
  MODIFY TABLE c_th_data FROM <s_data_wa>.
ENDIF.
Depending on your requirements you can also use the collect statement.
If c_th_data shows the reference data as well, you may need to adjust the filter to restrict it to the correct values.
Hope this helps.
Best regards
Matthias Nutt
SAP Consulting Switzerland -
ORA-31061 error while creating XMLType table with virtual column
I'm not calling it frustration ;)
but still... what about this one :
SQL> select * from v$version;
BANNER
Oracle Database 11g Express Edition Release 11.2.0.2.0 - Production
PL/SQL Release 11.2.0.2.0 - Production
CORE 11.2.0.2.0 Production
TNS for 32-bit Windows: Version 11.2.0.2.0 - Production
NLSRTL Version 11.2.0.2.0 - Production
SQL> create table test_virtual of xmltype
2 xmltype store as binary xml
3 virtual columns (
4 doc_id as (
5 xmlcast(
6 xmlquery('/root/@id'
7 passing object_value returning content)
8 as number
9 )
10 )
11 )
12 ;
Table created.
Now, on the latest version:
SQL> select * from v$version;
BANNER
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for 32-bit Windows: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production
SQL> create table test_virtual of xmltype
2 xmltype store as binary xml
3 virtual columns (
4 doc_id as (
5 xmlcast(
6 xmlquery('/root/@id'
7 passing object_value returning content)
8 as number
9 )
10 )
11 )
12 ;
passing object_value returning content)
ERROR at line 7:
ORA-00604: error occurred at recursive SQL level 1
ORA-31061: XDB error: dbms_xdbutil_int.get_tablespace_tab
ORA-06512: at "XDB.DBMS_XDBUTIL_INT", line 1002
Is there something I should be aware of?
Right now, I'm just evaluating the version so I can't submit any SR.
Thanks to anyone trying to reproduce the issue.
Just tested again on a new installation (64-bit server).
It works :
SQL> select * from v$version;
BANNER
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
PL/SQL Release 11.2.0.3.0 - Production
CORE 11.2.0.3.0 Production
TNS for 64-bit Windows: Version 11.2.0.3.0 - Production
NLSRTL Version 11.2.0.3.0 - Production
SQL>
SQL> create table test_virtual of xmltype
2 xmltype store as binary xml
3 virtual columns (
4 doc_id as (
5 xmlcast(
6 xmlquery('/root/@id'
7 passing object_value returning content)
8 as number
9 )
10 )
11 );
Table created
Now I'll try to see what are the differences between the two installations.
Thanks Dan and Marco for looking into this.
Edited by: odie_63 on 2 May 2012 15:51 -
Error while accessing table from procedure but no error from anonymous PL/SQL
Hi All,
I am getting a strange error while accessing a table from a different schema.
In the schema concerned, OWBSYS, I executed the following:
grant select on wb_rt_audit to ods;
In the ODS schema I executed:
CREATE OR REPLACE SYNONYM wb_rt_audit FOR owbsys.wb_rt_audit;
In the ODS schema, when I execute:
create or replace
procedure pp_test as
lv_owb_reject number := 0;
lv_filename_1 varchar2(200):= 'asda';
begin
SELECT MAX(aud.rta_iid) into lv_owb_reject
FROM wb_rt_audit aud
WHERE aud.rta_lob_name LIKE Upper(lv_filename_1);
end;
/
I get the error:
Warning: execution completed with warning
procedure Compiled.
ORA-00942 - TABLE OR VIEW DOES NOT EXIST
However, when I execute the same code as an anonymous PL/SQL block:
declare
lv_owb_reject number := 0;
lv_filename_1 varchar2(200):= 'asda';
begin
SELECT MAX(aud.rta_iid) INTO lv_owb_reject
FROM wb_rt_audit aud
WHERE aud.rta_lob_name LIKE Upper(lv_filename_1);
end;
anonymous block completed
there is no issue.
Can someone help me understand what I might be missing?
Edited by: Chaitanya on Feb 28, 2012 12:31 AM
Check if you have missed some other step:
SQL>conn scott1/tiger
Connected.
SQL>create table wb_rt_audit (rta_iid number);
Table created.
SQL>insert into wb_rt_audit values (100);
1 row created.
SQL>insert into wb_rt_audit values (200);
1 row created.
SQL>commit;
Commit complete.
SQL>grant select on wb_rt_audit to scott2;
Grant succeeded.
SQL>conn scott2/tiger
Connected.
SQL>create synonym wb_rt_audit for scott1.wb_rt_audit;
Synonym created.
SQL>create or replace procedure pp_test as
l_number number(10);
begin
SELECT MAX(rta_iid) into l_number
FROM wb_rt_audit;
end pp_test;
Procedure created. -
Buffer Table not up to date error while accessing Catalogs
Hello,
I am getting a "Buffer table not up-to-date" error while accessing Ariba procurement catalogs.
Initially, my test system was pointing to the production catalogs, but when I changed the settings in "Define External Web Services" and pointed it to the DEV catalogs, it failed with the error "Buffer table not up-to-date".
I have already refreshed the buffer, but the error is still not fixed.
We already have similar settings in other systems, and there the connections work fine.
Please advise
Regards
Manish Agrawal
Hi Manish,
Could you please check the notes below and implement them, since I had the same issue when punching out from the catalog to the SRM system.
2041631 - Simplified Shopping Cart:Dump on Check Out From Catalog
2086844 - Dump "Buffer table not up-to-date" on Check Out from Catalog
In addition to the above, please maintain the portal information in the path as shown in the screenshot below.
Kindly check and let me know how it goes.
Best Regards,
Bharathi -
Getting Error while accessing the data using odata service
Hi All,
I am new to SAP Fiori.
I am getting the below error while accessing data using an OData service.
"Failed to load resource: the server responded with a status of 404 (Not found)"
"No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin "
I have tried all the solutions, like changing the URL pattern "proxy/http",
and disabling security in Chrome (Chrome is the updated version).
I tried with IE and still got the same problem.
And I installed all the required software in Eclipse.
While installing the GWPA plugin I got the following error.
Let me know if anyone has an idea.
Thanks in advance.
> Do you want to add and/or update the data in the already existing tables or do you want to replace the content completely?
>
> so in that way: both the options are fine, whatever takes less time.
Sorry mate, but YOU have to know what you want here.
I gave you an easy-to-follow set of steps.
As you don't seem to mind the outcome, you might just use them...
> I wanted to know whether I can use loadercli for this export/import or not? If yes, is there any new step to do before I do the export/import?
We had this discussion before...
>
> For that the easiest option would be just to drop the tables of SAPR3 and run the import again.
>
> For ease of use you could also just do:
> - logon as superdba
> - drop user SAPR3
> - create user SAPR3 password SOMEPW not exclusive dba
>
> After these steps you can easily pump the data into the database again.
>
> So here in the above given steps, I am creating a new SAPR3 user, and why is it not exclusive dba?
> I already have that user SAPR3; can I use the same?
Yes, you do have the SAPR3 user.
But you don't seem to like to read documentation or learn about how the tools work or anything like that.
Therefore I gave you a simple way to reach your goal.
Of course it's possible to reuse the user.
But then you would have to deal with already existing tables, already existing data etc.
You don't seem to be able to do that. So, the easy steps might be better suited for your needs.
regards,
Lars -
Loading issue : Error: sql error in the database while accessing a table
Hello,
One of the DTPs in the process chain failed due to *Error: sql error in the database while accessing a table*, whereas the short dump shows "Transaction log of database is full". But I checked the database space in DB02, and more space is available. When we run the same DTP manually it is successful; it does not throw any errors.
Could you please help me solve the problem?
Thanks
siva kumar.
It might be a lock. Do you drop the index before loading?
The database might be full at the moment of loading and not later, if many loads happen at the same time on the same system...
When you then rerun your DTP manually, it can go through, as it's perhaps the only one running at that moment...
You can try to set the batch parameter to 1... this will help in some cases.
M. -
Problem Summary
DG4ODBC: STRING DATA, RIGHT TRUNCATION ERROR WHILE ACCESSING VARCHAR(MAX) COLUMN
Driver
Microsoft® ODBC Driver 11 for SQL Server® - RedHat Linux
Problem Description
When selecting a MS SQL VARCHAR(max) column over an ODBC gateway database connection, I am getting this error from Oracle:
ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[Microsoft][ODBC Driver 11 for SQL Server]String data, right truncation {01004}
[Microsoft][ODBC Driver 11 for SQL Server]String data, right truncation {01004}
[Microsoft][ODBC Driver 11 for SQL Server]String data, right truncation {01004}
[Microsoft][ODBC Driver 11 for SQL Server]String data, right truncation {01004}
[Microsoft][ODBC Driver 11 for SQL Server]String data, right truncation {01004}
ORA-02063: preceding 2 lines from <LINK_NAME>
The ODBC driver should map the varchar(max) column to SQL_LONGVARCHAR, which would be appropriate for Oracle, but the column is getting truncated.
Issue
By default the SQL Server ODBC driver exposes the varchar(max) data type as a SQL_VARCHAR. When reporting the maximum size of a varchar(max) column, the driver returns 0, which is the Microsoft convention for "unlimited".
[ODBC][25518][1399527750.588980][SQLDescribeCol.c][497]
Exit:[SQL_SUCCESS]
Column Name = [raw_response]
Data Type = 0x7fffe3cbe1a4 -> 12
Column Size = 0x7fffe3cbe158 -> 0
Decimal Digits = 0x7fffe3cbe1ac -> 0
Nullable = 0x7fffe3cbe1b0 -> 1
DG4ODBC is unable to interpret a zero length as an "unlimited" size and returns an error when retrieving varchar(max) data.
FreeTDS and DataDirect ODBC drivers return SQL_LONGVARCHAR instead of SQL_VARCHAR with 0 precision. So there is no problem reported for these drivers.
Is there a fix for this or is the Microsoft ODBC driver team working on a fix for the driver regarding varchar(max)?
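In the meantime, a workaround I could fall back on (a sketch only: it assumes the data fits in varchar(8000), and the view and base table names are made up for illustration) is a SQL Server view that CASTs the varchar(max) column to a bounded type the gateway can describe:

```sql
-- On the SQL Server side: expose raw_response with an explicit bounded length.
CREATE VIEW dbo.v_responses AS
SELECT CAST(raw_response AS varchar(8000)) AS raw_response
FROM dbo.responses;  -- hypothetical base table

-- On the Oracle side, query the view through the gateway instead:
-- SELECT "raw_response" FROM v_responses@<LINK_NAME>;
```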
Regards,
James
Hi James,
Thank you for your question. I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected from the job transfer. Your patience is greatly appreciated.
If you have any feedback on our support, please click
here.
Regards,
Elvis Long
TechNet Community Support -
System error while accessing characteristic data
Hi Experts,
While copying the settings for process management from plant 1 to plant 2, for the step "process instruction categories", I am facing the error below:
System error while accessing characteristic data
Message no. 0C002
Diagnosis
An internal error occurred while accessing the characteristic data. The characteristic cannot be used.
Procedure
Contact your system administrator.
I am using the transaction O20C.
Can someone please throw some light on this?
Praveen
Hi AP,
Thanks for the reply.
I have already checked that note; I didn't find any solution in it.
And I am on ECC 6.0 EHP5.
Praveen