Batch inconsistency error & loss of data
Hi All,
Almost all of our SFG and FG materials are batch managed. Suddenly we are facing a batch inconsistency error that was not there earlier.
When a production order confirmation is done, batches are created and batch characteristic results are entered. But in many cases these batch values are not updated in table MCH1, and we get a batch inconsistency error. For example, during usage decision (UD) in Quality, the batch inconsistency error does not allow the lot to be passed.
We are correcting this with transaction BMCC, but the major issue is that after running the transaction all the batch characteristics data is lost, so we have to re-enter the results again.
The frequency of this error is very high now, almost 4-5 times a day. As we are approaching Go-Live very soon, this has become a critical issue.
Please help with suggestions to rectify the problem.
Thanks in advance!
I usually find inconsistencies to be related to specific characteristics, usually those with procedures or functions attached that are used to infer or check values. Are you using characteristics in this way at all on the batch?
Is this a brand new system or are you going live with a new business into an existing SAP system? Is there any chance any of the internal tables used by the batch data have been corrupted somehow? The batch characteristic tables are more complicated than most, using a lot of internal pointers to link tables together. If a conversion program or upload program mucked around directly in any of these tables, that could have caused significant problems in your tables.
I'd first look to see if I could find a commonality among the problem batches, i.e., a similar characteristic, or the same group of classes involved. Maybe the materials were created and classified on the same day? Something that would indicate the problem. Computer bugs are never random. They just often appear that way!
FF
Similar Messages
-
Good day,
I am currently working on an SSIS package that was built by a previous user. The package imports data from an Excel source into a SQL table. The Excel sheet holds multiple columns; however, I am having difficulty with one column when running the package.
The column's rows hold a character and an integer (e.g. N2). The data types are as follows:
SQL – varchar(3),NULL
SSIS - double-precision float [DT_R8]
I then inserted a data conversion task to change the data type. The task holds the following properties:
Input column – F6
Output Alias – Copy of F6
Data Type – string[DT_STR]
Length – 50
When I execute the package I receive the following error:
[Excel Source Data [76]] Error: There was an error with output column "F6" (91) on output "Excel Source Output" (85). The column status returned was: "The value could not be converted because
of a potential loss of data.".
I do know that usually the message "The value could not be converted because of a potential loss of data." refers to a data type or length problem.
Further insight into this problem will be much appreciated.
Hi Zahid01,
From the error message you posted, we can infer that the issue is caused by the input column “F6” failing to convert to the output column in the Excel Source.
Based on my test, I can reproduce the issue in my environment. The exact cause is that the data type of the input column “F6” cannot be converted to the data type of the corresponding output column. For example, if the data type of input column “F6” is [DT_STR], the data type of the corresponding output column is [DT_I4], and there are some non-numeric values in the input column “F6”, then the error occurs when we execute the task. We can verify the issue in the Advanced Editor dialog box of the Excel Source. (A screenshot of the Advanced Editor was attached in the original post for reference.)
To avoid this issue, please make sure the input column can be converted to the output column in the Excel Source, including all the values in the input column.
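The strict per-value check described above can be pictured outside SSIS. A minimal Python sketch (illustrative only, not the SSIS engine; the names are made up) of why a single mixed value like "N2" fails a numeric column conversion:

```python
def convertible_to_float(value):
    """True if this cell would survive a strict numeric conversion."""
    try:
        float(value)
        return True
    except ValueError:
        return False

# A column like F6: mostly numeric, but one value mixes a letter and a digit.
column = ["12", "3.5", "N2"]
bad = [v for v in column if not convertible_to_float(v)]
print(bad)  # the rows that would trigger "potential loss of data"
```

Note that the error text in the question shows the failure occurring at the Excel Source output itself, so a downstream data conversion task runs too late to help.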
If there are any other questions, please feel free to ask.
Thanks,
Katherine Xiong
TechNet Community Support
-
hi,
Here I am giving the DDLs and inserts for the database. I am getting an error while fetching data into outlist. The error is:
err_msg = ORA-00932: inconsistent datatypes: expected - got -
I also posted the same in my previous post but did not get any satisfactory answer. Please help me out; it's urgent.
CREATE TABLE ip_lov_hdr
(table_id VARCHAR2(50) NOT NULL,
table_name VARCHAR2(30) NOT NULL,
col_name VARCHAR2(30) NOT NULL,
codetype VARCHAR2(2))
PCTFREE 10
INITRANS 1
MAXTRANS 255
ALTER TABLE ip_lov_hdr
ADD CONSTRAINT pk_lov_hdr PRIMARY KEY (table_id)
USING INDEX
PCTFREE 10
INITRANS 2
MAXTRANS 255
CREATE TABLE ip_lov_dtl
(table_id VARCHAR2(50) NOT NULL,
col_name VARCHAR2(30) NOT NULL)
PCTFREE 10
INITRANS 1
MAXTRANS 255
ALTER TABLE ip_lov_dtl
ADD CONSTRAINT pk_lov_dtl PRIMARY KEY (table_id, col_name)
USING INDEX
PCTFREE 10
INITRANS 2
MAXTRANS 255
ALTER TABLE ip_lov_dtl
ADD CONSTRAINT fk_lov_hdr FOREIGN KEY (table_id)
REFERENCES ip_lov_hdr (table_id) ON DELETE SET NULL
CREATE TABLE emp
(ename VARCHAR2(50),
empno VARCHAR2(9),
dept VARCHAR2(4))
PCTFREE 10
INITRANS 1
MAXTRANS 255
CREATE OR REPLACE
TYPE out_rec_lov AS OBJECT(ename VARCHAR2(50),empno VARCHAR2(9))
INSERT INTO emp
(ENAME,EMPNO,DEPT)
VALUES
('titu','111','10')
INSERT INTO emp
(ENAME,EMPNO,DEPT)
VALUES
('naria','222',NULL)
INSERT INTO emp
(ENAME,EMPNO,DEPT)
VALUES
('tiks','123','55')
INSERT INTO emp
(ENAME,EMPNO,DEPT)
VALUES
('tiki','221',NULL)
INSERT INTO emp
(ENAME,EMPNO,DEPT)
VALUES
('narayan',NULL,NULL)
INSERT INTO ip_lov_hdr
(TABLE_ID,TABLE_NAME,COL_NAME,CODETYPE)
VALUES
('emp_id','emp','ename',NULL)
INSERT INTO ip_lov_dtl
(TABLE_ID,COL_NAME)
VALUES
('emp_id','empno')
create or replace PACKAGE PKG_LOV_1
AS
TYPE listtable IS TABLE OF out_rec_lov INDEX BY BINARY_INTEGER;
PROCEDURE p_getlov (
tab_id IN VARCHAR2,
col_value IN VARCHAR2,
OUTLIST OUT LISTTABLE,
err_msg OUT VARCHAR2);
END;
create or replace PACKAGE BODY PKG_LOV_1 AS
PROCEDURE p_getlov( tab_id IN VARCHAR2,
col_value IN VARCHAR2,
outlist OUT listtable,
err_msg OUT VARCHAR2)
IS
query_str VARCHAR2(2000);
col_str VARCHAR2(2000);
type cur_typ IS ref CURSOR;
c cur_typ;
i NUMBER := 0;
l_table_name ip_lov_hdr.TABLE_NAME %TYPE;
l_col_name ip_lov_hdr.col_name%TYPE;
l_codetype ip_lov_hdr.codetype%TYPE;
BEGIN
BEGIN
SELECT TABLE_NAME,
codetype,
col_name
INTO l_table_name,
l_codetype,
l_col_name
FROM ip_lov_hdr
WHERE UPPER(table_id) = UPPER(tab_id);
EXCEPTION
WHEN no_data_found THEN
NULL;
END;
col_str := l_col_name;
FOR rec IN
(SELECT col_name
FROM ip_lov_dtl
WHERE table_id = tab_id)
LOOP
col_str := col_str || ',' || rec.col_name;
END LOOP;
DBMS_OUTPUT.PUT_LINE (col_str);
IF l_codetype IS NULL THEN
query_str := 'SELECT ' || col_str || ' FROM ' || l_table_name || ' WHERE ' || l_col_name || ' like :col_value';
ELSE
query_str := 'SELECT ' || col_str || ' FROM ' || l_table_name || ' WHERE Codetype =' || l_codetype || ' And ' || l_col_name || ' like :col_value';
END IF;
DBMS_OUTPUT.PUT_LINE(QUERY_STR);
BEGIN
OPEN c FOR query_str USING col_value;
LOOP
FETCH c INTO outlist(i);
i := i + 1;
EXIT WHEN c % NOTFOUND;
END LOOP;
CLOSE c;
EXCEPTION
WHEN others THEN
err_msg := SUBSTR(sqlerrm, 1, 500);
END;
EXCEPTION
WHEN others THEN
err_msg := SUBSTR(sqlerrm, 1, 500);
END p_getlov;
END PKG_LOV_1;
Regards,
Dhabas
Edited by: Dhabas on Dec 29, 2008 12:58 PM
Edited by: Dhabas on Dec 29, 2008 2:43 PM
Edited by: Dhabas on Dec 29, 2008 2:54 PM
I made one more small change. This code
15       i number := 0;
I modified as
15       i number := 1;
and it works fine for me:
SQL> create table ip_lov_hdr
2 (
3 table_id varchar2(50) not null,
4 table_name varchar2(30) not null,
5 col_name varchar2(30) not null,
6 codetype varchar2(2)
7 )
8 /
Table created.
SQL> alter table ip_lov_hdr add constraint pk_lov_hdr primary key (table_id) using index
2 /
Table altered.
SQL> create table ip_lov_dtl
2 (
3 table_id varchar2(50) not null,
4 col_name varchar2(30) not null
5 )
6 /
Table created.
SQL> alter table ip_lov_dtl add constraint pk_lov_dtl primary key (table_id, col_name) using index
2 /
Table altered.
SQL> alter table ip_lov_dtl add constraint fk_lov_hdr foreign key (table_id) references ip_lov_hdr (table_id) on delete set null
2 /
Table altered.
SQL> create table emp1
2 (
3 ename varchar2(50),
4 emp1no varchar2(9),
5 dept varchar2(4)
6 )
7 /
Table created.
SQL> create or replace type out_rec_lov as object(ename varchar2(50),emp1no varchar2(9))
2 /
Type created.
SQL> insert into emp1
2 (ename,emp1no,dept)
3 values
4 ('titu','111','10')
5 /
1 row created.
SQL> insert into emp1
2 (ename,emp1no,dept)
3 values
4 ('naria','222',null)
5 /
1 row created.
SQL> insert into emp1
2 (ename,emp1no,dept)
3 values
4 ('tiks','123','55')
5 /
1 row created.
SQL> insert into emp1
2 (ename,emp1no,dept)
3 values
4 ('tiki','221',null)
5 /
1 row created.
SQL> insert into emp1
2 (ename,emp1no,dept)
3 values
4 ('narayan',null,null)
5 /
1 row created.
SQL> insert into ip_lov_hdr
2 (table_id,table_name,col_name,codetype)
3 values
4 ('emp1_id','emp1','ename',null)
5 /
1 row created.
SQL> insert into ip_lov_dtl
2 (table_id,col_name)
3 values
4 ('emp1_id','emp1no')
5 /
1 row created.
SQL> commit
2 /
Commit complete.
SQL> create or replace package pkg_lov_1
2 as
3 type listtable is table of out_rec_lov index by binary_integer;
4
5 procedure p_getlov
6 (
7 tab_id in varchar2,
8 col_value in varchar2,
9 outlist out listtable,
10 err_msg out varchar2
11 );
12 end;
13 /
Package created.
SQL> create or replace package body pkg_lov_1
2 as
3 procedure p_getlov
4 (
5 tab_id in varchar2,
6 col_value in varchar2,
7 outlist out listtable,
8 err_msg out varchar2
9 )
10 is
11 query_str varchar2(2000);
12 col_str varchar2(2000);
13 type cur_typ is ref cursor;
14 c cur_typ;
15 i number := 1;
16 l_table_name ip_lov_hdr.table_name %type;
17 l_col_name ip_lov_hdr.col_name%type;
18 l_codetype ip_lov_hdr.codetype%type;
19 begin
20 begin
21 select table_name,
22 codetype,
23 col_name
24 into l_table_name,
25 l_codetype,
26 l_col_name
27 from ip_lov_hdr
28 where upper(table_id) = upper(tab_id);
29
30 exception
31 when no_data_found then
32 null;
33 end;
34
35 col_str := l_col_name;
36
37 for rec in (select col_name
38 from ip_lov_dtl
39 where table_id = tab_id)
40 loop
41 col_str := col_str || ',' || rec.col_name;
42 end loop;
43
44 dbms_output.put_line (col_str);
45
46 if l_codetype is null
47 then
48 query_str := 'select out_rec_lov(' || col_str || ') from ' || l_table_name || ' where ' || l_col_name || ' like :col_value';
49 else
50 query_str := 'select out_rec_lov(' || col_str || ') from ' || l_table_name || ' where codetype =' || l_codetype || ' and ' || l_col_name || ' like :col_value';
51 end if;
52
53 dbms_output.put_line(query_str);
54
55 begin
56 open c for query_str using col_value;
57
58 loop
59 fetch c into outlist(i);
60 i := i + 1;
61 exit when c % notfound;
62 end loop;
63
64 close c;
65 exception
66 when others then
67 err_msg := substr(sqlerrm, 1, 500);
68 end;
69
70 exception
71 when others then
72 err_msg := substr(sqlerrm, 1, 500);
73
74 end p_getlov;
75 end pkg_lov_1;
76 /
Package body created.
SQL> declare
2 outlist pkg_lov_1.listtable;
3 err_msg varchar2(1000);
4 begin
5 pkg_lov_1.p_getlov('emp1_id', 'titu', outlist, err_msg);
6 for i in 1..outlist.count
7 loop
8 dbms_output.put_line(outlist(i).ename ||','||outlist(i).emp1no);
9 end loop;
10
11 dbms_output.put_line(err_msg);
12 end;
13 /
ename,emp1no
select out_rec_lov(ename,emp1no) from emp1 where ename like :col_value
titu,111
PL/SQL procedure successfully completed.
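Note what actually changed between the two versions: the original built the query as `SELECT ename, empno ...` while fetching into elements of the object type out_rec_lov, whereas the working version selects `out_rec_lov(ename, emp1no)` so the cursor yields one object per row. A rough Python analogy of that datatype mismatch (a hypothetical sketch, just to picture it):

```python
class OutRec:
    """Stand-in for the SQL object type out_rec_lov."""
    def __init__(self, ename, empno):
        self.ename = ename
        self.empno = empno

# What "SELECT ename, empno ..." yields per row: two scalar columns.
row = ("titu", "111")

# Fetching two scalars into one object-typed slot is the analogue of
# ORA-00932 "inconsistent datatypes". The fix mirrors
# SELECT out_rec_lov(ename, empno): build the object from the columns.
rec = OutRec(*row)
print(rec.ename, rec.empno)
```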
SQL>
-
Error While extracting data from FIAA and FIAP
Hi Gurus,
I am facing an error while extracting data from the R/3 source (0FIAA and 0FIAP DataSources). I get the same error repeatedly, and no data arrives in BW. When I check in RSA3, I can extract records.
The error is as follows:
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
Kindly help. Your points are assured.
Thanks and Regards
Prasad
Hello Prasad,
Have you already checked what happened in the source system?
You should verify whether a job is running in SM37, and whether there are runtime-error dumps from the extraction in ST22.
It could be a clue of what happened in R/3.
Let us know,
Regards,
Mickael
-
Hi Folks,
In the system log, daily I am encountering the message "Error in transaction data".
The messages are listed in SM21 in sequence as:
Error processing batch input session PMFMASSIGN
> Queue ID: 07011706105177351507
> Transaction no. 1, block no. 1
> Error in transaction data
On more detailed analysis, after drilling further through each message I found the following message relevant -
Documentation for system log message D2 0 :
Entry relating to an error during the processing of a batch input session. In the batch input session the data for the next transaction is requested. However, the data read is no transaction data. The session has probably been destroyed in the database.
Please explain why the issue is occurring. I remember that some time back I executed RSBTCDEL2 on Production to clear invalid batch sessions but had to terminate it due to a long runtime.
Kindly help me fix this.
Regards,
Nick
One interesting discovery I just found in R/3 was this job log for the above process chain:
It says that the job was cancelled in R/3 because the material ledger currencies were changed.
The process chain is for inventory management, and the data load processes that get cancelled in the source system are:
1. Material Valuation: period ending inventories
2. Material Valuation: prices
The performance assistant says the following, but I am not sure how far I can work on the R/3 side to rectify this:
Material ledger currencies were changed
Diagnosis
The currencies currently set for the material ledger and the currency types set for valuation area 6205 differ from those set at conversion of the data (production startup).
System Response
The system does not allow you to post transactions after changing the currency settings to ensure consistency.
Procedure
Replace the current settings with those entered at production start-up.
If you wish to change the currency settings, you must use programs to convert the data from the old to the new currencies.
Inform your system administrator.
Anyone knowledgeable in this area, please give your inputs.
- DB
-
Hello,
EBS version : 11.5.10.2
DB version : 11.2.0.3
OS version : AIX 6.1
As part of an 11.5.10.2 to R12.1.1 upgrade, while applying the merged 12.1.1 upgrade driver (u6678700.drv), we got the following error:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ATTENTION: All workers either have failed or are waiting:
FAILED: file glsupdas.ldt on worker 3.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 adwork003.log
Restarting job that failed and was fixed.
Time when worker restarted job: Wed Aug 07 2013 10:36:14
Loading data using FNDLOAD function.
FNDLOAD APPS/***** 0 Y UPLOAD @SQLGL:patch/115/import/glnlsdas.lct @SQLGL:patch/115/import/US/glsupdas.ldt -
Connecting to APPS......Connected successfully.
Calling FNDLOAD function.
Returned from FNDLOAD function.
Log file: /fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log/US_glsupdas_ldt.log
Error calling FNDLOAD function.
Time when worker failed: Wed Aug 07 2013 10:36:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 US_glsupdas_ldt.log
Current system time is Wed Aug 7 10:36:14 2013
Uploading from the data file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt
Altering database NLS_LANGUAGE environment to AMERICAN
Dumping from LCT/LDT files (/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0), /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt) to staging tables
Dumping LCT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0) into FND_SEED_STAGE_CONFIG
Dumping LDT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt into FND_SEED_STAGE_ENTITY
Dumped the batch (GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS , GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS ) into FND_SEED_STAGE_ENTITY
Uploading from staging tables
Error loading seed data for GL_DEFAS_ACCESS_SETS: DEFINITION_ACCESS_SET = SUPER_USER_DEFAS, ORA-06508: PL/SQL: could not find program unit being called
Concurrent request completed
Current system time is Wed Aug 7 10:36:14 2013
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Below is info about file versions and INVALID packages related to GL.
The PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG is invalid with the error: component 'GL_DEFAS_DBNAME_S' must be declared.
I can see that GL_DEFAS_DBNAME_S is a VALID sequence accessible by the APPS user, with or without specifying GL as the owner.
SQL> select text from dba_source where name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG') and line=2;
TEXT
/* $Header: glistdds.pls 120.4 2005/05/05 01:23:16 kvora ship $ */
/* $Header: glistddb.pls 120.16 2006/04/10 21:28:48 cma ship $ */
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
SQL> select * from all_objects
  2  where object_name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG');

OWNER  OBJECT_NAME                  OBJECT_ID  OBJECT_TYPE   CREATED    LAST_DDL_TIME  TIMESTAMP            STATUS   T G S  NAMESPACE
APPS   GL_DEFAS_ACCESS_DETAILS_PKG  1118545    PACKAGE       05-AUG-13  05-AUG-13      2013-08-05:18:54:51  VALID    N N N  1
APPS   GL_DEFAS_ACCESS_SETS_PKG     1118548    PACKAGE       05-AUG-13  06-AUG-13      2013-08-05:18:54:51  VALID    N N N  1
APPS   GL_DEFAS_ACCESS_SETS_PKG     1128507    PACKAGE BODY  05-AUG-13  06-AUG-13      2013-08-06:12:56:50  INVALID  N N N  2
APPS   GL_DEFAS_ACCESS_DETAILS_PKG  1128508    PACKAGE BODY  05-AUG-13  05-AUG-13      2013-08-05:19:43:51  VALID    N N N  2

(SUBOBJECT_NAME, DATA_OBJECT_ID and EDITION_NAME are empty and omitted above.)

SQL> select * from all_objects where object_name='GL_DEFAS_DBNAME_S';

OWNER  OBJECT_NAME        OBJECT_ID  OBJECT_TYPE  CREATED    LAST_DDL_TIME  TIMESTAMP            STATUS  T G S  NAMESPACE
GL     GL_DEFAS_DBNAME_S  1087285    SEQUENCE     05-AUG-13  05-AUG-13      2013-08-05:17:34:43  VALID   N N N  1
APPS   GL_DEFAS_DBNAME_S  1087299    SYNONYM      05-AUG-13  05-AUG-13      2013-08-05:17:34:43  VALID   N N N  1
SQL> conn apps/apps
Connected.
SQL> SELECT OWNER, OBJECT_NAME, OBJECT_TYPE, STATUS
  2  FROM DBA_OBJECTS
  3  WHERE OBJECT_NAME = 'GL_DEFAS_ACCESS_SETS_PKG';

OWNER  OBJECT_NAME               OBJECT_TYPE   STATUS
APPS   GL_DEFAS_ACCESS_SETS_PKG  PACKAGE       VALID
APPS   GL_DEFAS_ACCESS_SETS_PKG  PACKAGE BODY  INVALID

SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE;
Warning: Package altered with compilation errors.
SQL> show error
No errors.
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE BODY;
Warning: Package Body altered with compilation errors.
SQL> show error
Errors for PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG:
LINE/COL  ERROR
39/17     PLS-00302: component 'GL_DEFAS_DBNAME_S' must be declared
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>cat $GL_TOP/patch/115/sql/glistdab.pls|grep -n GL_DEFAS_DBNAME_S
68: SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
81: fnd_message.set_token('SEQUENCE', 'GL_DEFAS_DBNAME_S');
SQL> show user
USER is "APPS"
SQL> SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL FROM dual;  -- with GL.
NEXTVAL
1002
SQL> SELECT GL_DEFAS_DBNAME_S.NEXTVAL FROM dual;  -- without GL., using the synonym
NEXTVAL
1003
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdab.pls|grep '$Header'
REM | $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ |
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdas.pls |grep '$Header'
REM | $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ |
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
-
Hello Friends,
I am using RFBIBL00 to create GL postings. I have used this program successfully before and never had the following issue:
RFBIBL00 creates a batch, and when the batch is processed I get the following error: Field COBL-PRODPER does not exist in the screen SAPLKACB 0002. In my test, I was trying to create a basic document with just one simple debit and one credit entry. Could this be a config issue? Any input/help will be greatly appreciated.
Thanks,
Sam
Well, it worked. I was instantiating the batch input structures, but somehow some fields, including PRODPER, were not affected (maybe because they were extended fields?).
Anyway, Rob - your post made me think that PRODPER was going into the batch input process without '/' (the no-data character), which was correct!
thanks!
-
In which tables are batch input error messages logged?
Does anybody know in which table the batch input error messages are logged?
I have to display the error messages that occurred during the transaction.
I tried to find out, but I could only see BDCMSGCOLL, and that structure holds only the batch input message number, not the message text itself.
Does anybody know about the batch input error messages?
Hi,
check the sample code below.
messtab is of type BDCMSGCOLL.
Call the transaction and then read T100 to get the message text.
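As an aside, the substitution the ABAP sample below performs (numbered placeholders &1..&4, or bare & filled left to right, followed by CONDENSE) can be modeled in a few lines of Python (an illustrative sketch, not SAP code; the function name is invented):

```python
def render_t100(template, msgv):
    """Fill a T100-style message template with message variables.

    T100 texts use numbered placeholders &1..&4; older texts use
    bare '&' placeholders that are filled left to right.
    """
    if "&1" in template:
        for i, value in enumerate(msgv[:4], start=1):
            template = template.replace("&%d" % i, value)
    else:
        for value in msgv[:4]:
            template = template.replace("&", value, 1)
    # CONDENSE-like cleanup: collapse repeated blanks.
    return " ".join(template.split())

print(render_t100("Delivery &1 item &2 updated", ["80001234", "000020"]))
```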
DATA: l_mstring(480).
* call the transaction, collecting its messages into messtab
REFRESH messtab.
CALL TRANSACTION tcode USING bdcdata
MODE ctumode
UPDATE cupdate
MESSAGES INTO messtab.
l_subrc = sy-subrc.
SKIP.
LOOP AT messtab.
SELECT SINGLE * FROM t100 WHERE sprsl = messtab-msgspra
AND arbgb = messtab-msgid
AND msgnr = messtab-msgnr.
IF sy-subrc = 0.
l_mstring = t100-text.
IF l_mstring CS '&1'.
REPLACE '&1' WITH messtab-msgv1 INTO l_mstring.
REPLACE '&2' WITH messtab-msgv2 INTO l_mstring.
REPLACE '&3' WITH messtab-msgv3 INTO l_mstring.
REPLACE '&4' WITH messtab-msgv4 INTO l_mstring.
ELSE.
REPLACE '&' WITH messtab-msgv1 INTO l_mstring.
REPLACE '&' WITH messtab-msgv2 INTO l_mstring.
REPLACE '&' WITH messtab-msgv3 INTO l_mstring.
REPLACE '&' WITH messtab-msgv4 INTO l_mstring.
ENDIF.
CONDENSE l_mstring.
WRITE: /4 messtab-msgtyp, l_mstring(250).
ELSE.
WRITE: /4 messtab.
ENDIF.
ENDLOOP.
-
Hi,
Has anyone tried loading data records to HANA DB 1.00 Build 20 from the CTL files provided in https://wiki.wdf.sap.corp/wiki/display/HANADemoenvironment/HowtoloadCOPAdata ?
It worked on HANA DB 1.00 Build 15 but not with the new build.
The procedure was exactly the same.
The creation of the tables succeeds, but I get this error when importing data from each CSV file:
Command with HANA Studio (Build 20):
load from '/tmp/export/TCURT.ctl' threads 4 batch 1000;
Error:
Could not execute 'load from '/tmp/export/TCURT.ctl' threads 4 batch 1000'
SAP DBTech JDBC: [257]: sql syntax error: line 1 (at pos 3643214585)
I also tried: load from '/tmp/export/TCURT.ctl' -> same error...
We have all privileges on the /tmp folder and files.
I also tried copying fresh data from the source...
Do you have an idea?
Thanks in advance for your help.
Best regards,
Eric MG
Hello,
The LOAD TABLE command was not public: the full syntax was never mentioned in any SAP guide on help.sap.com or the Service Marketplace (except the SAP HANA Pocketbook, DRAFT). Therefore it is SAP-internal syntax that can be changed at any time, and I guess this is what happened: the command is either deprecated or has a different syntax.
If you need to import a CSV file, I would suggest the official ways: either the IMPORT command (which was also not very public before, but since SP03 it is mentioned in the SQL Script Guide) or, best, the SAP HANA Studio import functionality (just be sure you choose CSV as the file type).
You can find explanation of both ways in this blog:
SAP HANA - Modeling Content Migration - part 2: Import, Post-migration Activities
Tomas
-
Error on inserting data into a list
Hi, I need to retrieve data from one list and update another list. Currently I am updating the list by the batch processing method (creating batch XML). It works when the data is around 300-400 rows, but for large data it throws a timeout error. Kindly advise how I can achieve this; it's urgent. I am using SharePoint 2010 Online, so I am unable to increase the timeout. Is there any other way to achieve this?
Anand B
Hi,
Per my knowledge, increasing the timeout limit is not supported in SharePoint Online.
For SharePoint Server on-premises, we can increase the timeout in the configuration file; however, we cannot access the configuration file in SharePoint Online.
http://office.microsoft.com/en-in/office365-sharepoint-online-enterprise-help/upload-files-to-a-library-HA102803549.aspx
http://sharepoint.stackexchange.com/questions/66784/increase-sharepoint-execution-timeout-in-sharepoint-online-office-365
As this is the forum for SharePoint Server on-premises and I'm not familiar with SharePoint Online, you can post the issue to the forum for SharePoint Online:
http://social.technet.microsoft.com/Forums/msonline/en-US/home?forum=onlineservicessharepoint
More experts will assist you there, and you will get more information relating to SharePoint Online.
Thank you for your understanding and support.
Thanks & Regards,
Jason
Jason Guo
TechNet Community Support
-
Hi Gurus,
I have a program that rounds down quantities in delivery notes. It runs in the background along with all the other programs. When we checked the batch job error log, we found that when it gets to this round-down program there are SAP-generated error messages (examples are listed below).
The issue is that there is no way to know exactly which deliveries the errors come from, or whether the program actually rounded down the quantities of all the deliveries passed to it, because the error messages are not specific enough (as you can see below).
The program passes the new rounded-down quantity from the internal table using a BAPI.
Is there a way to change these SAP-generated error messages to be specific enough, for example stating which delivery number is affected, or even to stop at the error and send an email to the user to change the delivery? Or is there a way we can replace the SAP-generated messages with our own messages and send an email to the user?
Error Messages:-
01/16/2008 01:40:40 Step 005 started (program ZSDSO_DELIVERY_QTY_ROUNDDOWN, variant , user ID AUTOSYSUSER
01/16/2008 01:51:35 Item 000020 belongs to delivery group 001
01/16/2008 01:52:13 Item 000020 belongs to delivery group 001
01/16/2008 01:52:15 Item 000030 belongs to delivery group 001
01/16/2008 01:52:19 Item 000020 belongs to delivery group 001
01/16/2008 01:52:29 Dynamic credit check has been exceeded 23,984.52 USD
01/16/2008 01:52:30 Item 000020 (change quantity manually to 400 PC because of complex struct
01/16/2008 01:52:30 Item 000030 (change quantity manually to 400 PC because of complex struct
01/16/2008 01:52:30 Quantity correlation for dependent items has been carried out
01/16/2008 01:52:30 Item 000020 belongs to delivery group 008
01/16/2008 01:52:30 Item 000030 belongs to delivery group 008
01/16/2008 01:52:31 Dynamic credit check has been exceeded 23,984.52 USD
01/16/2008 01:52:32 Delivery quantity must be entered for the item
01/16/2008 01:52:34 Dynamic credit check has been exceeded 23,984.52 USD
01/16/2008 01:53:05 Item 000020 belongs to delivery group 001
01/16/2008 01:54:40 Dynamic credit check has been exceeded 15,501.88 USD
01/16/2008 01:55:53 Item 000020 belongs to delivery group 001
Thanks.
Points will be awarded.
This is the subroutine FORM code:
FORM call_bapi_delivery_change TABLES p_i_lips LIKE i_lips.
DATA: w_header_data TYPE bapiobdlvhdrchg,
w_header_control TYPE bapiobdlvhdrctrlchg,
w_delivery TYPE bapiobdlvhdrchg-deliv_numb.
DATA: i_item_data TYPE STANDARD TABLE OF bapiobdlvitemchg,
w_item_data TYPE bapiobdlvitemchg,
i_item_control TYPE STANDARD TABLE OF bapiobdlvitemctrlchg,
w_item_control TYPE bapiobdlvitemctrlchg,
i_return TYPE STANDARD TABLE OF bapiret2,
w_return TYPE bapiret2.
LOOP AT p_i_lips INTO w_lips.
w_item_data-deliv_numb = w_lips-vbeln.
w_item_data-deliv_item = w_lips-posnr.
w_item_data-material = w_lips-matnr.
w_item_data-batch = w_lips-charg.
w_item_data-dlv_qty = w_lips-lfimg.
w_item_data-dlv_qty_imunit = w_lips-lfimg.
w_item_data-fact_unit_nom = 1.
w_item_data-fact_unit_denom = 1.
APPEND w_item_data TO i_item_data.
CLEAR w_item_control.
w_item_control-deliv_numb = w_lips-vbeln.
w_item_control-deliv_item = w_lips-posnr.
IF w_lips-delete NE 'X'.
w_item_control-chg_delqty = 'X'.
ELSEIF w_lips-delete EQ 'X'.
w_item_control-del_item = 'X'.
ENDIF.
APPEND w_item_control TO i_item_control.
ENDLOOP.
w_header_data-deliv_numb = w_lips-vbeln.
w_header_control-deliv_numb = w_lips-vbeln.
w_delivery = w_lips-vbeln.
CALL FUNCTION 'BAPI_OUTB_DELIVERY_CHANGE'
  EXPORTING
    header_data         = w_header_data
    header_control      = w_header_control
    delivery            = w_delivery
*   techn_control       =
  TABLES
*   header_partner      =
*   header_partner_addr =
*   header_deadlines    =
    item_data           = i_item_data
    item_control        = i_item_control
*   item_serial_no      =
*   supplier_cons_data  =
*   extension1          =
*   extension2          =
    return              = i_return
*   tokenreference      =
    .
IF sy-subrc EQ 0.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
EXPORTING
wait = 'X'.
w_zbdcopendn-otce706 = 'X'.
MODIFY i_zbdcopendn FROM w_zbdcopendn TRANSPORTING otce706
WHERE delivery EQ w_deliveries-vbeln.
ELSE.
CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
DELETE i_zbdcopendn WHERE delivery EQ w_deliveries-vbeln.
DELETE i_deliveries WHERE vbeln EQ w_lips-vbeln.
w_bad_delv-vbeln = w_deliveries-vbeln.
APPEND w_bad_delv TO i_bad_delv.
ENDIF.
ENDFORM. " call_bapi_delivery_change
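One possible way to make the log delivery-specific (my suggestion, not something the thread confirms): after each BAPI call, keep the RETURN (bapiret2) rows keyed by the delivery number they were raised for, and mail that log instead of relying on the job log. Sketched in Python with hypothetical structures mirroring bapiret2:

```python
from collections import defaultdict

def collect_bapi_messages(results):
    """Group error messages by the delivery that produced them.

    `results` mimics a loop over deliveries: one
    (delivery_number, return_rows) pair per BAPI call, where each
    return row carries a message type ('E' error, 'A' abort, ...).
    """
    log = defaultdict(list)
    for delivery, return_rows in results:
        for row in return_rows:
            if row["type"] in ("E", "A"):  # keep errors and aborts only
                log[delivery].append(row["message"])
    return dict(log)

results = [
    ("0080001234", [{"type": "E",
                     "message": "Delivery quantity must be entered for the item"}]),
    ("0080001235", [{"type": "S", "message": "Delivery changed"}]),
]
print(collect_bapi_messages(results))
```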
Thanks again.
-
Error when Publish data to passive cache
Hi,
I am using active-passive push replication. When I add data to the active cache, the data is added, but I get the error below when it is published to the passive cache.
2011-03-31 22:38:06.014/291.473 Oracle Coherence GE 3.6.1.0 <Error> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Failed to publish EntryOperation{siteName=site1, clusterName=Cluster1, cacheName=dist-contact-cache, operation=Insert, publishableEntry=PublishableEntry{key=Binary(length=12, value=0x154E094D65656E616B736869), value=Binary(length=88, value=0x12813A15A90F00004E094D65656E61
6B736869014E07506C6F74203137024E074368656E6E6169034E0954616D696C4E616475044E06363030303432401A155B014E0524737263244E0E73697465312D436C757374657231), originalValue=Binary(length=0, value=0x)}} to Cache dist-contact-cache because of
(Wrapped) java.io.StreamCorruptedException: invalid type: 78 Class:com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher
2011-03-31 22:38:06.014/291.473 Oracle Coherence GE 3.6.1.0 <D5> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while processing a InvocationRequest for Service=Proxy:ExtendTcpProxyService:TcpAcceptor: (Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache[dist-contact-cache]) java.lang.IllegalStateException: Attempted to publish to cache dist-contact-cache
Here is my Coherence cache config XML file:
<!DOCTYPE cache-config SYSTEM "cache-config.dtd">
<cache-config xmlns:sync="class:com.oracle.coherence.patterns.pushreplication.configuration.PushReplicationNamespaceContentHandler">
<caching-schemes>
<sync:provider pof-enabled="true">
<sync:coherence-provider />
</sync:provider>
<caching-scheme-mapping>
<cache-mapping>
<cache-name>dist-contact-cache</cache-name>
<scheme-name>distributed-scheme-with-publishing-cachestore</scheme-name>
<sync:publisher>
<sync:publisher-name>Active Publisher</sync:publisher-name>
<sync:publisher-scheme>
<sync:remote-cluster-publisher-scheme>
<sync:remote-invocation-service-name>remote-site2</sync:remote-invocation-service-name>
<sync:remote-publisher-scheme>
<sync:local-cache-publisher-scheme>
<sync:target-cache-name>dist-contact-cache</sync:target-cache-name>
</sync:local-cache-publisher-scheme>
</sync:remote-publisher-scheme>
<sync:autostart>true</sync:autostart>
</sync:remote-cluster-publisher-scheme>
</sync:publisher-scheme>
</sync:publisher>
</cache-mapping>
</caching-scheme-mapping>
<proxy-scheme>
<service-name>ExtendTcpProxyService</service-name>
<thread-count>5</thread-count>
<acceptor-config>
<tcp-acceptor>
<local-address>
<address>localhost</address>
<port>9099</port>
</local-address>
</tcp-acceptor>
</acceptor-config>
<autostart>true</autostart>
</proxy-scheme>
<remote-invocation-scheme>
<service-name>remote-site2</service-name>
<initiator-config>
<tcp-initiator>
<remote-addresses>
<socket-address>
<address>localhost</address>
<port>5000</port>
</socket-address>
</remote-addresses>
<connect-timeout>2s</connect-timeout>
</tcp-initiator>
<outgoing-message-handler>
<request-timeout>5s</request-timeout>
</outgoing-message-handler>
</initiator-config>
<serializer>
<class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
</serializer>
</remote-invocation-scheme>
</caching-schemes>
</cache-config>
Please find below the active and passive server start commands and log files.
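For what it's worth, the `StreamCorruptedException: invalid type: 78` in the logs below usually indicates a serializer mismatch: the entry was written in POF (the provider has `pof-enabled="true"` and the remote invocation service uses `ConfigurablePofContext`), but the passive cluster's distributed cache service deserializes the key with `DefaultSerializer` (note `DefaultSerializer.deserialize` in the stack trace). A sketch of the kind of fix to try on the passive side is below — the file name and scheme name are assumptions, not taken from the posted config, since the passive-side config was not shown:

```xml
<!-- Passive-side cache config (assumed file: passive-cache-config.xml).
     Declare the POF serializer explicitly on the distributed scheme that
     backs dist-contact-cache, so binaries written in POF by the active
     cluster are not read back with DefaultSerializer. -->
<distributed-scheme>
  <scheme-name>distributed-scheme-with-publishing-cachestore</scheme-name>
  <service-name>DistributedCache</service-name>
  <serializer>
    <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
    <init-params>
      <init-param>
        <param-type>String</param-type>
        <param-value>pof-config.xml</param-value>
      </init-param>
    </init-params>
  </serializer>
  <backing-map-scheme>
    <local-scheme/>
  </backing-map-scheme>
  <autostart>true</autostart>
</distributed-scheme>
```

Alternatively, starting both cache servers with `-Dtangosol.pof.enabled=true` makes POF the default serializer cluster-wide, which is often simpler than per-scheme serializer declarations; either way, the active and passive clusters must agree on the serializer and on `pof-config.xml`.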
Thanks.
Active server start command
java -server -showversion -Xms128m -Xmx128m -Dtangosol.coherence.ttl=0 -Dtangosol.coherence.cacheconfig=active-cache-config.xml
-Dtangosol.coherence.cluster="Cluster1" -Dtangosol.coherence.site="site1" -Dtangosol.coherence.clusteraddress="224.3.6.2"
-Dtangosol.coherence.clusterport="3001" -Dtangosol.pof.config=pof-config.xml -cp "config;lib\custom-types.jar;lib\coherence.jar;lib/coherence-common-1.7.3.20019.jar;lib/coherence-pushreplicationpattern-3.0.3.20019.jar;lib/coherence-messagingpattern-2.7.3.20019.jar;" com.tangosol.net.DefaultCacheServer
Passive server start command
java -server -showversion -Xms128m -Xmx128m -Dtangosol.coherence.ttl=0 -Dtangosol.coherence.cacheconfig=passive-cache-config.xml
-Dtangosol.coherence.cluster="Cluster2" -Dtangosol.coherence.site="site2" -Dtangosol.coherence.clusteraddress="224.3.6.3"
-Dtangosol.coherence.clusterport="3003" -Dtangosol.pof.config=pof-config.xml -cp "config;lib\custom-types.jar;
lib\coherence.jar;lib/coherence-common-1.7.3.20019.jar;lib/coherence-pushreplicationpattern-3.0.3.20019.jar;
lib/coherence-messagingpattern-2.7.3.20019.jar;" com.tangosol.net.DefaultCacheServer
Active Server log
<Error> (thread=PublishingService:Thread-3, member=1): Failed to publish the range MessageTracker{MsgId{40-1} } of messages for subscription SubscriptionIdentifier{destinationIdentifier=Identifier{dist-contact-cache}, subscriberIdentifier=Identifier{Active Publisher}} with publisher com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher@423d4f Class:com.oracle.coherence.patterns.pushreplication.providers.coherence.CoherencePublishingService
2011-04-01 20:23:20.172/65.972 Oracle Coherence GE 3.6.1.0 <Info> (thread=PublishingService:Thread-3, member=1): Publisher Exception was as follows Class:com.oracle.coherence.patterns.pushreplication.providers.coherence.CoherencePublishingService
(Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache [dist-contact-cache]) java.lang.IllegalStateException: Attempted to publish to cache dist-contact-cache
at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:348)
at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.query(InvocationServiceProxy.CDB:6)
at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.onRun(InvocationServiceFactory.CDB:12)
at com.tangosol.coherence.component.net.extend.message.Request.run(Request.CDB:4)
at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.onMessage(InvocationServiceProxy.CDB:9)
at com.tangosol.coherence.component.net.extend.Channel.execute(Channel.CDB:39)
at com.tangosol.coherence.component.net.extend.Channel.receive(Channel.CDB:26)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:96)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
at java.lang.Thread.run(Thread.java:619)
Caused by: java.lang.IllegalStateException: Attempted to publish to cache dist-contact-cache
at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:163)
at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:343)
... 9 more
Caused by: (Wrapped) java.io.StreamCorruptedException: invalid type: 78
at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:266)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$ConverterKeyToBinary.convert(PartitionedService.CDB:16)
at com.tangosol.util.ConverterCollections$ConverterInvocableMap.invoke(ConverterCollections.java:2156)
at com.tangosol.util.ConverterCollections$ConverterNamedCache.invoke(ConverterCollections.java:2622)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.invoke(PartitionedCache.CDB:11)
at com.tangosol.coherence.component.util.SafeNamedCache.invoke(SafeNamedCache.CDB:1)
at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:142)
... 10 more
Caused by: java.io.StreamCorruptedException: invalid type: 78
at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2266)
at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2708)
at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:262)
... 16 more
Passive Server log
2011-04-01 20:23:20.141/56.925 Oracle Coherence GE 3.6.1.0 <Error> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Failed to publish EntryOperation{siteName=site1, clusterName=Cluster1, cacheName=dist-contact-cache, operation=Insert, publishableEntry=PublishableEntry{key=Binary(length=12, value=0x154E094D65656E616B736869), value=Binary(length=88, value=0x12813A15A90F00004E094D65656E616B736869014E07506C6F74203137024E074368656E6E6169034E0954616D696C4E616475044E06363030303432401A155B014E0524737263244E0E73697465312D436C757374657231), originalValue=Binary(length=0, value=0x)}} to Cache dist-contact-cache because of (Wrapped) java.io.StreamCorruptedException: invalid type: 78 Class:com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher
2011-04-01 20:23:20.141/56.925 Oracle Coherence GE 3.6.1.0 <D5> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while processing a InvocationRequest for Service=Proxy:ExtendTcpProxyService:TcpAcceptor: (Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache [dist-contact-cache]) java.lang.IllegalStateException: Attempted to publish to cache dist-contact-cache
at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:348)
at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.query(InvocationServiceProxy.CDB:6)
at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.onRun(InvocationServiceFactory.CDB:12)
at com.tangosol.coherence.component.net.extend.message.Request.run(Request.CDB:4)
at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.onMessage(InvocationServiceProxy.CDB:9)
at com.tangosol.coherence.component.net.extend.Channel.execute(Channel.CDB:39)
at com.tangosol.coherence.component.net.extend.Channel.receive(Channel.CDB:26)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:96)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
at java.lang.Thread.run(Thread.java:619)
Caused by: java.lang.IllegalStateException: Attempted to publish to cache dist-contact-cache
at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:163)
at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:343)
... 9 more
Caused by: (Wrapped) java.io.StreamCorruptedException: invalid type: 78
at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:266)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$ConverterKeyToBinary.convert(PartitionedService.CDB:16)
at com.tangosol.util.ConverterCollections$ConverterInvocableMap.invoke(ConverterCollections.java:2156)
at com.tangosol.util.ConverterCollections$ConverterNamedCache.invoke(ConverterCollections.java:2622)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.invoke(PartitionedCache.CDB:11)
at com.tangosol.coherence.component.util.SafeNamedCache.invoke(SafeNamedCache.CDB:1)
at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:142)
... 10 more
Caused by: java.io.StreamCorruptedException: invalid type: 78
at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2266)
at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2708)
at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:262)
... 16 more -
Incorrect XML display for ALERTCATEGORY error during deserialization: data loss
Hi,
I am getting the error below while executing the first scenario in the Quality environment; I have not defined or transported any ALERTs.
Incorrect XML disply for ALERTCATEGORY error during deserialization data loss occured when converting 011136c23c9430359b57c0ef8949c630
This error is stopping my interface from executing... has anyone experienced this issue?
Please shed some light on it.
Many thanks,
Raj
Have you checked SAP Note #1394710?
-
Hello all,
I executed a T-SQL query that retrieves a large amount of data; however, I get an error message after a few seconds.
The error message is:
An error occurred while executing batch. Error message is: There is not enough space on the disk.
Who can tell me the reason, and what are the steps to solve this issue?
Really, thanks.
Thanks, Jed Deng
The other two posters suggested that the problem is with tempdb or your data/log files. That is not the case here. The error message relates to the client side of things. Apparently SSMS needs to store data on disk. I don't recall exactly when this happens, but I seem to recall that the files appear in your %TEMP% directory.
I would suggest that the correct action is to clean up the disk. Not least, there may be a lot of junk in your %TEMP% and %TMP% directories.
Erland Sommarskog, SQL Server MVP, [email protected]
Thanks Erland, I was not aware of this earlier. Now I am able to find one of the reasons for the slowness of the SSMS grid. Also, I have just checked with Process Monitor: SSMS is creating tmp%.tmp files.
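The cleanup Erland suggests can be scripted before deleting anything blindly. A small generic sketch (not from the thread; it only reads file sizes, it deletes nothing) that lists the largest leftover *.tmp files under the temp directory (%TEMP% on Windows):

```python
import os
import tempfile

def largest_tmp_files(limit=10):
    """Return the `limit` largest *.tmp files under the temp directory,
    as (size_bytes, path) pairs sorted largest first."""
    tmp_dir = tempfile.gettempdir()
    found = []
    for root, _dirs, files in os.walk(tmp_dir):
        for name in files:
            if name.lower().endswith(".tmp"):
                path = os.path.join(root, name)
                try:
                    found.append((os.path.getsize(path), path))
                except OSError:
                    pass  # file vanished or is locked; skip it
    found.sort(reverse=True)
    return found[:limit]

if __name__ == "__main__":
    for size, path in largest_tmp_files():
        print(f"{size:>12}  {path}")
```

Reviewing the list first shows whether SSMS scratch files are actually what is filling the disk, before clearing %TEMP% and %TMP%.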
- Chintak (My Blog) -
Error -2000 necessary data reference could not be resolved-- HELP!!!
So, I work in a nurse's office at a medical school editing video on an iMac 2.33 GHz Intel Core 2 Duo with 3 GB 667 MHz DDR2 SDRAM. Mine is the only Mac in the office; the rest of the secretaries, nurses, and grad students are on Windows XP.
My job is basically to edit videos (color correct, blur faces of patients' relatives, etc) from a study they did, and export the videos to a DVD (as a data disk, not a DVD with menus) so they can show them to each other and various committees on their windows machines.
For the longest time, this went off without a hitch, but most recently, I tried burning some .mov files to disk, and whenever they play on any machine (windows xp) we get "Error -2000 necessary data reference could not be resolved." I re-burned the disks, and now, they are selective! They play on some of the office computers, but not all... including the target machine where they will ultimately be screened for a committee... even on the 2nd batch of disks, that error is still present...
I know about video editing, and to some degree disk burning (I used the "New-->burn folder" option to create the data disk, burning them at 2x speed for reliability). But I don't know enough about computers to figure out what this error means, or why these disks won't work!! Somebody please help!
(and you can save the sales pitch... I'm a mac user through and through, and there is no way in heck my office will ever have enough money or motivation to switch over)
Thanks for your help!!
I won't address the problem directly, except to say that you should still have your source videos. Don't you still have them on your camera? If not, just get them from your friend. By the way, do this the next time around: whenever you plug in the camera and your iPhoto/photo application opens, BEFORE you let iPhoto import anything, use Finder to copy the files over to your HDD or an external drive. That way YOU always have the originals before your photo application does.