ORA-13464: error loading GeoRaster data:
Hi:
I am trying to load a raster image in SQL*Plus Worksheet (gis/*****@SDB as sysdba) using SQL scripts, but I got this error:
declare
*
ERROR at line 1:
the Permission (java.io.FilePermission c:\temp\AnShan.tif read) has not been granted to GIS. The PL/SQL to grant this is dbms_java.grant_permission( 'GIS', 'SYS:java.io.FilePermission', 'c:\temp\AnShan.tif', 'read' )
ORA-06512: at "MDSYS.MD", line 1723
ORA-06512: at "MDSYS.MDERR", line 17
ORA-06512: at "MDSYS.SDO_GEOR", line 2889
ORA-06512: at line 9
The script:
declare
  geor MDSYS.SDO_GEORASTER;
begin
  insert into gis.RasterImages values( 4, 'TIFF', mdsys.sdo_geor.init('rdt_1', 4) );
  -- import the first TIFF image
  select image into geor from gis.RasterImages where geoid = 4 for update;
  mdsys.sdo_geor.importFrom(geor, '', 'TIFF', 'file', 'c:\temp\AnShan.tif');
  update gis.RasterImages set image = geor where geoid = 4;
  commit;
end;
I am sure that I have given read privilege on the file as well.
First, I created a connection as system/***** and executed:
call dbms_java.grant_permission('GIS', 'SYS:java.io.FilePermission', 'c:\temp\AnShan.tiff', 'read' );
call dbms_java.grant_permission('PUBLIC', 'SYS:java.io.FilePermission', 'c:\temp\AnShan.tiff', 'read' );
call dbms_java.grant_permission('MDSYS', 'SYS:java.io.FilePermission', 'c:\temp\AnShan.tif', 'read' );
The statements executed successfully. I am not sure what went wrong.
Please help me!
LGS
I think the new error you saw was due to the cross-schema issue on 10.1.0.2/10.1.0.3 (as described in another thread on creating pyramids).
You should connect as system to grant the permissions. With 10.1.0.2/10.1.0.3, importFrom should be called from the same schema where the GeoRaster tables are defined. (Upgrade to 10.1.0.4 or higher if you want to perform cross-schema operations.)
Other than the connection part, your scripts looked OK. Please make sure that the file with the specified file name does exist and is on the same machine where the database is running. Note that even if the grant_permission call is executed successfully, it does not necessarily mean the following file operations will always succeed. For example, grant_permission can be called successfully on a file that does not even exist. If a file with the specific name is later created, no permissions are granted on the new file. The case with the schema 'GIS' is similar. You need to make sure that the schema 'GIS' and the file do exist before the grant_permission call; and grant_permission can be called repeatedly, if necessary.
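As a quick sanity check (a sketch, not from the original posts; it assumes a DBA session), the permissions that were actually granted can be listed from DBA_JAVA_POLICY. Note that the grants in the question reference 'AnShan.tiff' while the importFrom call reads 'AnShan.tif'; the granted name must match the opened file exactly:

```sql
-- Sketch (assumes DBA access): list the Java file permissions actually granted.
-- A grant on 'c:\temp\AnShan.tiff' does not cover 'c:\temp\AnShan.tif' --
-- the file name in the grant and in importFrom must be identical.
SELECT grantee, type_name, name, action, enabled
  FROM dba_java_policy
 WHERE type_name = 'java.io.FilePermission'
   AND name LIKE '%AnShan%';
```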
Hope this helps.
Similar Messages
-
Hello,
EBS version : 11.5.10.2
DB version : 11.2.0.3
OS version : AIX 6.1
As a part of the 11.5.10.2 to R12.1.1 upgrade, while applying the merged 12.1.1 upgrade driver (u6678700.drv), we got the below error:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ATTENTION: All workers either have failed or are waiting:
FAILED: file glsupdas.ldt on worker 3.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 adwork003.log
Restarting job that failed and was fixed.
Time when worker restarted job: Wed Aug 07 2013 10:36:14
Loading data using FNDLOAD function.
FNDLOAD APPS/***** 0 Y UPLOAD @SQLGL:patch/115/import/glnlsdas.lct @SQLGL:patch/115/import/US/glsupdas.ldt -
Connecting to APPS......Connected successfully.
Calling FNDLOAD function.
Returned from FNDLOAD function.
Log file: /fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log/US_glsupdas_ldt.log
Error calling FNDLOAD function.
Time when worker failed: Wed Aug 07 2013 10:36:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 US_glsupdas_ldt.log
Current system time is Wed Aug 7 10:36:14 2013
Uploading from the data file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt
Altering database NLS_LANGUAGE environment to AMERICAN
Dumping from LCT/LDT files (/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0), /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt) to staging tables
Dumping LCT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0) into FND_SEED_STAGE_CONFIG
Dumping LDT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt into FND_SEED_STAGE_ENTITY
Dumped the batch (GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS , GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS ) into FND_SEED_STAGE_ENTITY
Uploading from staging tables
Error loading seed data for GL_DEFAS_ACCESS_SETS: DEFINITION_ACCESS_SET = SUPER_USER_DEFAS, ORA-06508: PL/SQL: could not find program unit being called
Concurrent request completed
Current system time is Wed Aug 7 10:36:14 2013
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Below is info about file versions and INVALID packages related to GL.
PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG is invalid with error component 'GL_DEFAS_DBNAME_S' must be declared.
I can see GL_DEFAS_DBNAME_S is a VALID sequence accessible by apps user with or without specifying GL as owner.
SQL> select text from dba_source where name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG') and line=2;
TEXT
/* $Header: glistdds.pls 120.4 2005/05/05 01:23:16 kvora ship $ */
/* $Header: glistddb.pls 120.16 2006/04/10 21:28:48 cma ship $ */
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
SQL> select * from all_objects where object_name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG');

OWNER  OBJECT_NAME                  OBJECT_ID  OBJECT_TYPE   CREATED    LAST_DDL_TIME  TIMESTAMP            STATUS   T G S  NAMESPACE  EDITION_NAME
APPS   GL_DEFAS_ACCESS_DETAILS_PKG  1118545    PACKAGE       05-AUG-13  05-AUG-13      2013-08-05:18:54:51  VALID    N N N  1
APPS   GL_DEFAS_ACCESS_SETS_PKG     1118548    PACKAGE       05-AUG-13  06-AUG-13      2013-08-05:18:54:51  VALID    N N N  1
APPS   GL_DEFAS_ACCESS_SETS_PKG     1128507    PACKAGE BODY  05-AUG-13  06-AUG-13      2013-08-06:12:56:50  INVALID  N N N  2
APPS   GL_DEFAS_ACCESS_DETAILS_PKG  1128508    PACKAGE BODY  05-AUG-13  05-AUG-13      2013-08-05:19:43:51  VALID    N N N  2

SQL> select * from all_objects where object_name='GL_DEFAS_DBNAME_S';

OWNER  OBJECT_NAME        OBJECT_ID  OBJECT_TYPE  CREATED    LAST_DDL_TIME  TIMESTAMP            STATUS  T G S  NAMESPACE  EDITION_NAME
GL     GL_DEFAS_DBNAME_S  1087285    SEQUENCE     05-AUG-13  05-AUG-13      2013-08-05:17:34:43  VALID   N N N  1
APPS   GL_DEFAS_DBNAME_S  1087299    SYNONYM     05-AUG-13  05-AUG-13      2013-08-05:17:34:43  VALID   N N N  1
SQL> conn apps/apps
Connected.
SQL> SELECT OWNER, OBJECT_NAME, OBJECT_TYPE, STATUS
  2  FROM DBA_OBJECTS
  3  WHERE OBJECT_NAME = 'GL_DEFAS_ACCESS_SETS_PKG';

OWNER  OBJECT_NAME               OBJECT_TYPE   STATUS
APPS   GL_DEFAS_ACCESS_SETS_PKG  PACKAGE       VALID
APPS   GL_DEFAS_ACCESS_SETS_PKG  PACKAGE BODY  INVALID

SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE;
Warning: Package altered with compilation errors.

SQL> show error
No errors.

SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE BODY;
Warning: Package Body altered with compilation errors.

SQL> show error
Errors for PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG:

LINE/COL  ERROR
39/17     PLS-00302: component 'GL_DEFAS_DBNAME_S' must be declared
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>cat $GL_TOP/patch/115/sql/glistdab.pls|grep -n GL_DEFAS_DBNAME_S
68: SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
81: fnd_message.set_token('SEQUENCE', 'GL_DEFAS_DBNAME_S');
SQL> show user
USER is "APPS"
SQL> SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
  2  FROM dual;  -- with GL.

NEXTVAL
1002

SQL> SELECT GL_DEFAS_DBNAME_S.NEXTVAL from dual;  -- without GL., i.e. using the synonym

NEXTVAL
1003
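Since the sequence resolves fine interactively both with and without the owner prefix, one hedged next step (a sketch, not something the thread itself shows) is a serial recompile of all invalid objects, followed by a re-check of the GL packages:

```sql
-- Sketch (assumes a SYS or suitably privileged session):
-- recompile everything currently invalid, then re-check the GL package body.
EXEC UTL_RECOMP.RECOMP_SERIAL();

SELECT owner, object_name, object_type, status
  FROM dba_objects
 WHERE status = 'INVALID'
   AND object_name LIKE 'GL_DEFAS%';
```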
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdab.pls|grep '$Header'
REM | $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ |
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdas.pls |grep '$Header'
REM | $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ |
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */ -
Export/Import error : ORA-31609: error loading file "kualter.xsl"
Hi ,
I was exporting the database and got the following error. After that I found that dbms_metadata_util.load_stylesheets needs to be run. Please assist me with how to proceed further.
C:\Users\379164>expdp import1/import1 directory=dmp_dir dumpfile=dump.dmp full=
Export: Release 10.2.0.1.0 - Production on Friday, 15 October, 2010 14:50:45
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
ORA-39006: internal error
ORA-39213: Metadata processing is not available
====================================================================
SQL> execute dbms_metadata_util.load_stylesheets
BEGIN dbms_metadata_util.load_stylesheets; END;
ERROR at line 1:
ORA-31609: error loading file "kualter.xsl" from file system directory
"C:\oraclexe\app\oracle\product\10.2.0\server\rdbms\xml\xsl"
ORA-06512: at "SYS.DBMS_METADATA_UTIL", line 1793
ORA-06512: at line 1
Please help me on this ....
Thanks

Refer to the following MOS note (even though the error number in its header is different):
Ora-39213 Using Data Pump Export (Doc ID 402242.1) -
Cfreport & "Error loading image data"
I'm using CF Report Builder.
I inserted into the report an image linked to a URL.
When I execute the report I receive this error:
Report data binding error Error loading image data :
http://DOMAIN/PATH/0701.jpg.
The image exists: if I copy the URL into my browser, it appears.
If I replace the link with a filesystem path, I receive the same error.
This is the error exception:
"Error","jrpp-7","01/29/07","21:13:20","marcappelli_frontend","Report
data binding error Error loading image data :
http://venusia/lavoro/marsportcappelli_it/html/catalogo/2007/0701.jpg.Error
loading byte data :
http://venusia/lavoro/marsportcappelli_it/html/catalogo/2007/0701.jpg
The specific sequence of files included or processed is:
/home/htdocs/clienti/marsportcappelli_it/html/_script/catalogo2007.cfm
coldfusion.runtime.report.Report$ReportDataBindingException:
Report data binding error Error loading image data :
http://venusia/lavoro/marsportcappelli_it/html/catalogo/2007/0701.jpg.
at
coldfusion.runtime.report.Report.runReport(Report.java:303)
at
coldfusion.tagext.lang.ReportTag.doEndTag(ReportTag.java:467)
at
coldfusion.runtime.CfJspPage._emptyTag(CfJspPage.java:1909)
at
cfcatalogo20072ecfm389367973.runPage(/home/htdocs/clienti/marsportcappelli_it/html/_scrip t/catalogo2007.cfm:13)
at coldfusion.runtime.CfJspPage.invoke(CfJspPage.java:152)
at
coldfusion.tagext.lang.IncludeTag.doStartTag(IncludeTag.java:343)
at
coldfusion.filter.CfincludeFilter.invoke(CfincludeFilter.java:65)
at
coldfusion.filter.ApplicationFilter.invoke(ApplicationFilter.java:210)
at
coldfusion.filter.RequestMonitorFilter.invoke(RequestMonitorFilter.java:51)
at coldfusion.filter.PathFilter.invoke(PathFilter.java:86)
at
coldfusion.filter.ExceptionFilter.invoke(ExceptionFilter.java:50)
at
coldfusion.filter.BrowserDebugFilter.invoke(BrowserDebugFilter.java:52)
at
coldfusion.filter.ClientScopePersistenceFilter.invoke(ClientScopePersistenceFilter.java:2 8)
at
coldfusion.filter.BrowserFilter.invoke(BrowserFilter.java:38)
at
coldfusion.filter.GlobalsFilter.invoke(GlobalsFilter.java:38)
at
coldfusion.filter.DatasourceFilter.invoke(DatasourceFilter.java:22)
at coldfusion.CfmServlet.service(CfmServlet.java:105)
at
coldfusion.bootstrap.BootstrapServlet.service(BootstrapServlet.java:78)
at
jrun.servlet.ServletInvoker.invoke(ServletInvoker.java:91)
at
jrun.servlet.JRunInvokerChain.invokeNext(JRunInvokerChain.java:42)
at
jrun.servlet.JRunRequestDispatcher.invoke(JRunRequestDispatcher.java:257)
at
jrun.servlet.ServletEngineService.dispatch(ServletEngineService.java:527)
at
jrun.servlet.jrpp.JRunProxyService.invokeRunnable(JRunProxyService.java:204)
at
jrunx.scheduler.ThreadPool$DownstreamMetrics.invokeRunnable(ThreadPool.java:349)
at
jrunx.scheduler.ThreadPool$ThreadThrottle.invokeRunnable(ThreadPool.java:457)
at
jrunx.scheduler.ThreadPool$UpstreamMetrics.invokeRunnable(ThreadPool.java:295)
at jrunx.scheduler.WorkerThread.run(WorkerThread.java:66)
Any ideas?
Any help is very much appreciated!
Thanks
Rob -
Error 'Loading Textfile data into external table'
I am using APEX 2.0, Oracle Database 10g, and Internet Explorer (version 6.5).
I tried to run the following code using SQL Commands in the SQL Workshop of APEX.
Code Here:
declare
ddl1 varchar2(200);
ddl2 varchar2(4000);
begin
ddl1 := 'create or replace directory data_dir as
''C:\LAUSD_DATA\''';
execute immediate ddl1;
ddl2:= 'create table tbl_temp_autoload
(ID NUMBER,
RECTYPE VARCHAR2(2),
EXPORTNBR NUMBER(2),
EXPORTDATE DATE,
BIMAGEID VARCHAR2(11),
APPLICNBR NUMBER(10),
MEALIMAGE VARCHAR2(17),
MEDIIMAGE VARCHAR2(17),
LANGUAGE VARCHAR2(1),
APPLICCNT NUMBER(1),
OTHAPPLICNBR VARCHAR2(10),
PEDETDATE DATE,
PESTATUS VARCHAR2(1),
ERRORREMARKS VARCHAR2(50),
COMMENTS VARCHAR2(50),
SID1 VARCHAR2(10),
WID1 NUMBER(10),
MEDIROW1 VARCHAR2(1),
LASTNAME1 VARCHAR2(20),
FIRSTNAME1 VARCHAR2(20),
SCHOOL1 VARCHAR2(15),
LOCCD1 VARCHAR2(4),
BIRTHDATE1 DATE,
CASENBR1 VARCHAR2(15),
FOSTER1 VARCHAR2(1),
CHILDINC1 VARCHAR2(4),
RACEAIAN VARCHAR2(1),
RACEBAA VARCHAR2(1),
RACENHPI VARCHAR2(1),
RACEASIAN VARCHAR2(1),
RACEWHITE VARCHAR2(1),
HISPANIC VARCHAR2(1),
NOTHISPANIC VARCHAR2(1),
ADULTSIG3 VARCHAR2(1),
ADULTSIGDATE3 VARCHAR2(10),
ADULTNAME3 VARCHAR2(30),
SSN3 VARCHAR2(1),
SSN3NOT VARCHAR2(1),
ADDRESS VARCHAR2(30),
APTNBR VARCHAR2(6),
CTY VARCHAR2(20),
ZIP NUMBER(5),
PHONEHOME VARCHAR2(12),
PHONEWORK VARCHAR2(12),
PHONEEXTEN VARCHAR2(6),
MEALROWA VARCHAR2(1),
LNAMEA VARCHAR2(20),
FNAMEA VARCHAR2(20),
MINITA VARCHAR2(6),
GENDERA VARCHAR2(1),
AGEA NUMBER(2),
MOTYPEA VARCHAR2(1),
MONAMEA VARCHAR2(1),
FATYPEA VARCHAR2(1),
FANAMEA VARCHAR2(1),
FAMNBR NUMBER(1),
PARENTINC NUMBER(5),
FAMILYINC NUMBER(5),
FAMILYSIZE NUMBER(2),
PGSIG1 VARCHAR2(1),
PGNAME1 VARCHAR2(30),
PGSIGDATE1 VARCHAR2(10),
FAMSIZE1 VARCHAR2(2),
FAMINC1 VARCHAR2(6),
PGSIG2 VARCHAR2(1),
PGNAME2 VARCHAR2(30),
PGSIGDATE2 DATE,
FAMSIZE2 VARCHAR2(2),
FAMINC2 VARCHAR2(4),
GRADE NUMBER(2),
SCHOOL VARCHAR2(40),
RACE VARCHAR2(4),
MEDI_CALID VARCHAR2(15),
MEDSTATUS VARCHAR2(1),
ADULT1NAME VARCHAR2(40),
ADULT1TYPE VARCHAR2(1),
ADULT1INC1 VARCHAR2(5),
ADULT1INC2 VARCHAR2(5),
ADULT1INC3 VARCHAR2(5),
ADULT1INC4 VARCHAR2(5),
ADULT2NAME VARCHAR2(40),
ADULT2TYPE VARCHAR2(1),
ADULT2INC1 VARCHAR2(5),
ADULT2INC2 VARCHAR2(5),
ADULT2INC3 VARCHAR2(5),
ADULT2INC4 VARCHAR2(5),
ADULT3NAME VARCHAR2(40),
ADULT3TYPE VARCHAR2(1),
ADULT3INC1 VARCHAR2(5),
ADULT3INC2 VARCHAR2(5),
ADULT3INC3 VARCHAR2(5),
ADULT3INC4 VARCHAR2(5),
ADULT4NAME VARCHAR2(40),
ADULT4TYPE VARCHAR2(1),
ADULT4INC1 VARCHAR2(5),
ADULT4INC2 VARCHAR2(5),
ADULT4INC3 VARCHAR2(5),
ADULT4INC4 VARCHAR2(5),
ADULT5NAME VARCHAR2(40),
ADULT5TYPE VARCHAR2(1),
ADULT5INC1 VARCHAR2(5),
ADULT5INC2 VARCHAR2(5),
ADULT5INC3 VARCHAR2(5),
ADULT5INC4 VARCHAR2(5),
ADULT6NAME VARCHAR2(40),
ADULT6TYPE VARCHAR2(1),
ADULT6INC1 VARCHAR2(5),
ADULT6INC2 VARCHAR2(5),
ADULT6INC3 VARCHAR2(5),
ADULT6INC4 VARCHAR2(5),
AGE1LT19 VARCHAR2(1),
AGE2LT19 VARCHAR2(1),
AGE3LT19 VARCHAR2(1),
AGE4LT19 VARCHAR2(1),
AGE5LT19 VARCHAR2(1),
AGE6LT19 VARCHAR2(1),
MIDINIT1 VARCHAR2(1)
)
organization external
( type oracle_loader
default directory data_dir
access parameters
( fields terminated by '','' )
location (''DataJuly07.txt'')
)';
execute immediate ddl2;
end;
Error Received:
ORA-29913:error in executing ODCIEXITTABLEFETCH callout ORA-30653: reject limit reached
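For reference, ORA-30653 means the load hit the external table's default REJECT LIMIT of 0, i.e. at least one row failed field conversion. A minimal sketch (hypothetical table name and column set; assumes the same data_dir directory object) that raises the limit and names bad/log files so the rejected rows can be inspected:

```sql
-- Sketch: external table that tolerates bad rows instead of raising ORA-30653.
-- Rejected records land in autoload.bad; reasons are written to autoload.log.
create table tbl_temp_autoload_sketch
( id      number,
  rectype varchar2(2)
)
organization external
( type oracle_loader
  default directory data_dir
  access parameters
  ( records delimited by newline
    badfile data_dir:'autoload.bad'
    logfile data_dir:'autoload.log'
    fields terminated by ','
    missing field values are null
  )
  location ('DataJuly07.txt')
)
reject limit unlimited;
```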
Please help ASAP. I am new to Oracle and APEX. Any help will be greatly appreciated.

I downloaded the External Table simple application from the APEX packaged applications and installed it. It installed successfully, but when I run it, it doesn't load the data even though I get a message stating it was successful. I am running it on Oracle XE - APEX 3.0.1.
In addition, I tried running the stored procedure directly (ext_employees_load) in SQL Developer and received ORA-30648.
Please help.
thanks,
Tom -
ORA-12154 Error loading the I$
Hi All,
When I try to run an interface, it fails with the below error when loading the data into the I$ table. It successfully creates the I$ table but fails only when loading the data. Can anyone help me with this?
ORA-12154: TNS:could not resolve the connect identifier specified
Thanks

Hi Phil,
I am running the interface with a Linux agent. Below is the code which is failing. The staging area is the same as the target, and the source and target connections are the same. I also checked the TNS file for the connection details; it is good.
insert /*+ APPEND */ into MANHATTANDR.I$_ASN_HDR
(
SHPMT_NBR,
TO_WHSE,
MANIF_NBR,
MANIF_TYPE,
BOL,
SHIP_VIA,
TRLR_NBR,
WORK_ORD_NBR,
CUT_NBR,
MFG_PLNT,
ASN_DTL_TYPE,
ASN_ORGN_TYPE,
RECV_AGAINST_ASN,
BUYER_CODE,
REP_NAME,
PRO_NBR,
DC_ORD_NBR,
CONTRAC_LOCN,
WHSE_XFER_FLAG,
QUAL_CHK_HOLD_UPON_RCPT,
QUAL_AUDIT_PCNT,
CASES_SHPD,
UNITS_SHPD,
CASES_RCVD,
UNITS_RCVD,
SHPD_DATE,
ARRIVAL_DATE_TIME,
FIRST_RCPT_DATE_TIME,
LAST_RCPT_DATE_TIME,
VERF_DATE_TIME,
VOL,
TOTAL_WT,
AUDIT_STAT_CODE,
STAT_CODE,
LABEL_PRT,
REF_CODE_1,
REF_FIELD_1,
REF_CODE_2,
REF_FIELD_2,
REF_CODE_3,
REF_FIELD_3,
MISC_INSTR_CODE_1,
MISC_INSTR_CODE_2,
CREATE_DATE_TIME,
MOD_DATE_TIME,
USER_ID,
CNTRY_OF_ORGN,
IMPORT_VESSEL,
IMPORT_VYGE_NBR,
EXPORT_DATE_TIME,
IMPORT_DATE_TIME,
ZONE_ADMSN_NBR,
PORT_OF_LADING,
PORT_OF_UNLADING,
CUST_SEAL_NBR,
INWARD_FWD_MANIF,
IN_BOND_CARR,
IN_TRANSIT_NBR,
IN_TRANSIT_DATE_TIME,
IN_TRANSIT_PORT,
ZONE_PAYMENT_STAT,
PLAN_ID,
ASN_SHPMT_TYPE,
SUPP_FTZ_ASN,
STOP_SEQ,
TRACTOR_NBR,
INITIATE_FLAG,
ILM_APPT_NBR,
INBD_LOAD_ID,
TMS_SHPMT_NBR,
SOURCE_ID
)
select
ASN_HDR.SHPMT_NBR,
ASN_HDR.TO_WHSE,
ASN_HDR.MANIF_NBR,
ASN_HDR.MANIF_TYPE,
ASN_HDR.BOL,
ASN_HDR.SHIP_VIA,
ASN_HDR.TRLR_NBR,
ASN_HDR.WORK_ORD_NBR,
ASN_HDR.CUT_NBR,
ASN_HDR.MFG_PLNT,
ASN_HDR.ASN_DTL_TYPE,
ASN_HDR.ASN_ORGN_TYPE,
ASN_HDR.RECV_AGAINST_ASN,
ASN_HDR.BUYER_CODE,
ASN_HDR.REP_NAME,
ASN_HDR.PRO_NBR,
ASN_HDR.DC_ORD_NBR,
ASN_HDR.CONTRAC_LOCN,
ASN_HDR.WHSE_XFER_FLAG,
ASN_HDR.QUAL_CHK_HOLD_UPON_RCPT,
ASN_HDR.QUAL_AUDIT_PCNT,
ASN_HDR.CASES_SHPD,
ASN_HDR.UNITS_SHPD,
ASN_HDR.CASES_RCVD,
ASN_HDR.UNITS_RCVD,
ASN_HDR.SHPD_DATE,
ASN_HDR.ARRIVAL_DATE_TIME,
ASN_HDR.FIRST_RCPT_DATE_TIME,
ASN_HDR.LAST_RCPT_DATE_TIME,
ASN_HDR.VERF_DATE_TIME,
ASN_HDR.VOL,
ASN_HDR.TOTAL_WT,
ASN_HDR.AUDIT_STAT_CODE,
ASN_HDR.STAT_CODE,
ASN_HDR.LABEL_PRT,
ASN_HDR.REF_CODE_1,
ASN_HDR.REF_FIELD_1,
ASN_HDR.REF_CODE_2,
ASN_HDR.REF_FIELD_2,
ASN_HDR.REF_CODE_3,
ASN_HDR.REF_FIELD_3,
ASN_HDR.MISC_INSTR_CODE_1,
ASN_HDR.MISC_INSTR_CODE_2,
ASN_HDR.CREATE_DATE_TIME,
ASN_HDR.MOD_DATE_TIME,
ASN_HDR.USER_ID,
ASN_HDR.CNTRY_OF_ORGN,
ASN_HDR.IMPORT_VESSEL,
ASN_HDR.IMPORT_VYGE_NBR,
ASN_HDR.EXPORT_DATE_TIME,
ASN_HDR.IMPORT_DATE_TIME,
ASN_HDR.ZONE_ADMSN_NBR,
ASN_HDR.PORT_OF_LADING,
ASN_HDR.PORT_OF_UNLADING,
ASN_HDR.CUST_SEAL_NBR,
ASN_HDR.INWARD_FWD_MANIF,
ASN_HDR.IN_BOND_CARR,
ASN_HDR.IN_TRANSIT_NBR,
ASN_HDR.IN_TRANSIT_DATE_TIME,
ASN_HDR.IN_TRANSIT_PORT,
ASN_HDR.ZONE_PAYMENT_STAT,
ASN_HDR.PLAN_ID,
ASN_HDR.ASN_SHPMT_TYPE,
ASN_HDR.SUPP_FTZ_ASN,
ASN_HDR.STOP_SEQ,
ASN_HDR.TRACTOR_NBR,
ASN_HDR.INITIATE_FLAG,
ASN_HDR.ILM_APPT_NBR,
ASN_HDR.INBD_LOAD_ID,
ASN_HDR.TMS_SHPMT_NBR,
#MANHATTAN.VERIFY_SOURCE
from MANHATTANDR.SYN_ASN_HDR ASN_HDR
where (1=1) -
Error loading the data into ODS - Message no. BRAIN060?
Hi,
I am getting the following error while loading data from a flat file. The data loaded successfully from the flat file to the PSA, but I got the following error while updating the data from the PSA to the data target:
Value '010384 javablue' (hex. '30003100300033003800340020006A0061007600610062006C') of characteristic 0PO_NUMBER contains invalid characters
Message no. BRAIN060
Diagnosis
The following standard characters are valid in characteristic values as default:
!"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ
Characteristic values are not allowed if they only consist of the character "#" or begin with "!". If the characteristic is compounded, this also applies to each partial key.
You are trying to load the invalid characteristic value 1. (hexidecimal representation 30003100300033003800340020006A0061007600610062006C).
I am trying to load the value '010384 javablue' into the "0PO_NUMBER" InfoObject for the ODS, in one row together with some other data.
Any idea or any input to resolve this issue?
Thanks in advance for any input.
Steve

Thanks Soumya. I have maintained the uppercase letters, but I am loading mixed upper and lower case into the PO number, and it is not working. What is the solution to this? If I set the lowercase property on the PO number InfoObject, it won't accept uppercase. If I uncheck lowercase, then it won't accept lowercase letters. I cannot add both upper and lowercase letters in RSKC, because it accepts up to 72 characters and I already have more than 60 (special characters, numbers, and 26 uppercase letters).
I have already tried a transfer routine, but that can only convert to lower or upper, which doesn't work. We need both upper and lower case for the PO number, and R/3 accepts it. Why doesn't BW accept both?
Any idea what can be done?
Thanks in advance for your help.
Steve -
Help with ORA 14400 error while inserting data
Hi all,
I am facing an ORA-14400 error in the following scenario; please help.
I created the table using this syntax:
CREATE TABLE temp_table
(
GRPKEY NUMBER(20, 0) NOT NULL,
UKEY NUMBER(10, 0),
ANUM VARCHAR2(250 BYTE),
APC VARCHAR2(2 BYTE),
SID VARCHAR2(65 BYTE),
RDATETIME VARCHAR2(19 BYTE),
CKEY NUMBER(20, 0),
AVER VARCHAR2(25 BYTE),
CVER VARCHAR2(250 BYTE),
TNAME VARCHAR2(50 BYTE),
SCODE VARCHAR2(30 BYTE),
PTYPE VARCHAR2(50 BYTE),
FILENUMB NUMBER(10, 0),
LINENUMB NUMBER(10, 0),
ENTRY_CREATEDDATE DATE
, CONSTRAINT temp_table_PK PRIMARY KEY (GRPKEY))
PARTITION BY RANGE(ENTRY_CREATEDDATE)
(PARTITION P0 VALUES LESS THAN(TO_DATE(' 2009-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')));
When I try to insert data using:
insert into temp_table values
(1,null,null,null,null,null,null,null,null,null,null,null,null,null,'01-NOV-2010');
I get the following error output:
Error report:
SQL Error: ORA-14400: inserted partition key does not map to any partition
14400. 00000 - "inserted partition key does not map to any partition"
*Cause: An attempt was made to insert a record into, a Range or Composite
Range object, with a concatenated partition key that is beyond
the concatenated partition bound list of the last partition -OR-
An attempt was made to insert a record into a List object with
a partition key that did not match the literal values specified
for any of the partitions.
*Action: Do not insert the key. Or, add a partition capable of accepting the key, or add values matching the key to a partition specification.

Hi Chaitanya,
Change your table script to
CREATE TABLE temp_table
(
GRPKEY NUMBER(20, 0) NOT NULL,
UKEY NUMBER(10, 0),
ANUM VARCHAR2(250 BYTE),
APC VARCHAR2(2 BYTE),
SID VARCHAR2(65 BYTE),
RDATETIME VARCHAR2(19 BYTE),
CKEY NUMBER(20, 0),
AVER VARCHAR2(25 BYTE),
CVER VARCHAR2(250 BYTE),
TNAME VARCHAR2(50 BYTE),
SCODE VARCHAR2(30 BYTE),
PTYPE VARCHAR2(50 BYTE),
FILENUMB NUMBER(10, 0),
LINENUMB NUMBER(10, 0),
ENTRY_CREATEDDATE DATE
, CONSTRAINT temp_table_PK PRIMARY KEY (GRPKEY))
PARTITION BY RANGE(ENTRY_CREATEDDATE)
(PARTITION P0 VALUES LESS THAN(TO_DATE(' 2009-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')),
PARTITION P1 VALUES LESS THAN(MAXVALUE)
);
insert into temp_table values
(1,null,null,null,null,null,null,null,null,null,null,null,null,null,'01-NOV-2010');
1 row inserted.

Or refer to the question regarding "Date Partitioning a table".
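As an alternative sketch (assuming the original table should be kept rather than recreated), a catch-all partition can simply be added to the existing table, which avoids ORA-14400 for any date on or after 2009-01-01:

```sql
-- Hypothetical alternative: add a catch-all partition to the existing table.
-- Only valid while the table does not yet have a MAXVALUE partition.
ALTER TABLE temp_table ADD PARTITION p_max VALUES LESS THAN (MAXVALUE);
```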
*009*
Edited by: 009 on Nov 3, 2010 11:29 PM -
Error Loading Transactional Data into Cube(0PCA_C01)
Hi Guys,
I am trying to install the following cubes from Business Content: 0PCA_C01 / 0PCA_C02 (Profit Center Analysis). Everything got replicated. I am trying to load transaction data now. I created an InfoPackage and loaded the data. It has been running for a long time and still says "not yet completed/warning". If I try to see the content of the cube, I get the following errors:
"Your user master record is not sufficiently maintained for object Authorization Object3.
System error: RSDRC / FORM AUTHORITY_CHECK USER NOT AUTHORIZED 0PCA_C01 0PCA_C01
System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET ERROR IN RSDRC_BASIC_QUERY_DATA_GET 0PCA_C01 64
System error: RSDRC / FORM DATA_GET ERROR IN RSDRC_BASIC_CUBE_DATA_GET 0PCA_C01 64"
Also, if I try to change something in the InfoPackage, it says "Init. select. for field name currently running in". I guess that's because the job is still running.
Please let me know if I missed something.
Raj

Hi Raj,
This seems to be an authorization case.
I guess you are in the BW Dev system.
Go to SU01, enter your user ID, and display. Go to the tabs Role and Profile.
Ideally in development, a developer's ID should have access to all development activities. So check with the Basis folks and get access to the relevant profiles. If you can get SAP_ALL, then perfect!
Prakash
Assigning points is a way of saying thanks on SDN!
Error loading the data into cube
Hi All,
When I am loading the data into the ASO cube it gives a warning, even though I did everything correctly. Yesterday it loaded correctly, but today it does not load; it gives the warning that the ASO application ignores updates to derived cells, even after I cleared the data. What is this error?
So please, can anyone help?
Reading Rule SQL Information For Database accdb
Reading Rules From Rule Object For Database accdb
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
Aggregate storage applications ignore update to derived cells. 29 cells skipped
Data Load Elapsed Time with [dataldcu.rul] : [0.094] seconds
Database import completed
Output columns prepared: [0]
Thanks
Ram

In ASO cubes you can only load data at level 0 (the leaf level). Every upload to aggregated dimension members is ignored. The warning message "ignore update to derived cells" means that you are trying to load data at node level. As this is not possible with ASO cubes, Essbase ignores that data.
-
Shared Services 9.3.1 error - "...error loading tree data"
We have recently upgraded to version 9.3.1 and have been using it for several months now.
This past week, when logging in to Shared Services, we are suddenly experiencing a pop up with an error that says "There was a communication error while loading tree data".
Has anyone seen this error, and why would it suddenly start appearing when everything has worked for months and no changes have been applied?
Thanks for any advice. I will post screens and some sections from log files soon...

We are also facing this issue while accessing the Hyperion Planning 9.3.1 application from Workspace, but it allows us to access it through the Planning URL. The Workspace classic administration for Planning works perfectly fine. Let me know in case you find a solution.
-
ORA-01511:error in renaming data file & ORA-01516: nonexistent data file
Hi all,
DB version is 10.2.0.2 and Applications 12.0.6 on RHEL 4.
While creating a data file, we mistakenly created it with a '?' in the name, as follows:
"/d01/CRP/db/apps_st/data/tx_?data53.dbf"
After that we tried to rename the data file using the steps below, but we are getting an error message:
1.SQL>ALTER TABLESPACE APPS_TS_TX_DATA OFFLINE NORMAL; at DB in OPEN stage
2.$mv /d01/CRP/db/apps_st/data/tx_?data53.dbf /d01/CRP/db/apps_st/data/tx_data53.dbf
3.ALTER TABLESPACE APPS_TS_TX_DATA RENAME DATAFILE '/d01/CRP/db/apps_st/data/tx_?data53.dbf' TO '/d01/CRP/db/apps_st/data/tx_data53.dbf';
ERROR at line 1:
ORA-01511: error in renaming log/data files
ORA-01516: nonexistent log file, datafile, or tempfile
"/d01/CRP/db/apps_st/data/tx_?data53.dbf"
After that we tried to revert by moving the file back to its original name (tx_?data53.dbf) using OS commands and tried to open the database, but we get the following errors:
SQL> startup mount
ORACLE instance started.
Total System Global Area 1073741824 bytes
Fixed Size 1264892 bytes
Variable Size 440402692 bytes
Database Buffers 620756992 bytes
Redo Buffers 11317248 bytes
Database mounted.
SQL> recover database;
Media recovery complete.
SQL> alter database open ;
alter database open
ERROR at line 1:
ORA-01092: ORACLE instance terminated. Disconnection forced
Please help us to resolve the issue.
Edited by: 912734 on Feb 15, 2012 2:43 AM

Errors in the alert log file:
ALTER TABLESPACE APPS_TS_TX_DATA RENAME DATAFILE '/d01/CRP/db/apps_st/data/tx_?data53.dbf' TO '/d01/CRP/db/apps_st/data/tx_data53.dbf'
Wed Feb 15 15:38:58 2012
ORA-1525 signalled during: ALTER TABLESPACE APPS_TS_TX_DATA RENAME DATAFILE '/d01/CRP/db/apps_st/data/tx_?data53.dbf' TO '/d01/CRP/db/apps_st/data/tx_data53.dbf'...
Wed Feb 15 15:40:10 2012
ALTER TABLESPACE APPS_TS_TX_DATA RENAME DATAFILE '/d01/CRP/db/apps_st/data/tx_ data53.dbf' TO '/d01/CRP/db/apps_st/data/tx_data53.dbf'
Wed Feb 15 15:40:10 2012
ORA-1525 signalled during: ALTER TABLESPACE APPS_TS_TX_DATA RENAME DATAFILE '/d01/CRP/db/apps_st/data/tx_ data53.dbf' TO '/d01/CRP/db/apps_st/data/tx_data53.dbf'...
ALTER TABLESPACE APPS_TS_TX_DATA ONLINE
Wed Feb 15 15:41:09 2012
Errors in file /d01/CRP/db/tech_st/10.2.0/admin/CRP2_oraapps/bdump/crp2_dbw0_1605.trc:
ORA-01157: cannot identify/lock data file 11 - see DBWR trace file
ORA-01110: data file 11: '/d01/CRP/db/apps_st/data/tx_
data53.dbf'
ORA-27037: unable to obtain file status -
Error loading Master Data ARFCSTATE = SYSFAIL.
Hi All,
When I am trying to load master data, one data package is not getting loaded. From R/3 the job completes, but in BW it is still yellow. When I checked the job log in R/3, it shows ARFCSTATE = SYSFAIL.
This indicates that the data package has a problem on the R/3 side itself.
Can anyone help me in solving this issue.
Regards
Bhanu

Hi,
I will suggest you to check a few places where you can see the status
1) Sysfail in SM37 generally means that the packet was not sent to the BW side. But this may not be true in all cases. Rarely, we have come across situations where you see sysfail for a particular package in R/3 but find the packet updated into the PSA. Check this in RSMO > Details.
In case you don't find that packet in the BW PSA, try this:
Go to R/3 > SM37 > go to the data packet with sysfail. Adjacent to that sysfail message you will see a TID number.
Copy that, go to SM58, and search for it among the pending tRFCs. There will be one with the same TID (Transaction ID). Execute it. The packet may get transferred to BW.
If that doesn't happen, you may have to retrigger the load.
Else, once you identify the error you can rectify it.
If all the records are in the PSA, you can pull them from the PSA to the target. Else you may have to pull them again from the source InfoProvider.
Thanks,
JituK -
Error loading master data RSDMD114
Attempting to load master data - several objects fail with message:
RSDMD114, "The Master Data is deleted in the background"
I went as far as to delete all of the old data and start again.
The object was reactivated.
No locks found on the lock tables.
Still getting the error.
Anyone seen this before ?
THANKS

<b>The problem has been solved.</b>
The data deletion job failed - so the control table was still set to show that the deletion run was in progress.
Table <b>RSDCHABASDEL, Status Logging for Master Data Deletion.</b>
Delete the entry from this table, look for the name of the object in question.
Everything ran OK with the entry cleared from the table. -
BPC75NW: Error Loading Master data for BPC Dimensions from BW InfoObject
Hello Gurus,
I'm trying to load master data for BPC Dimensions from BW Infoobjects.
The ID that's used in BW is 32 characters long. When I run the load, the ID is truncated after 20 characters and the records are reported as duplicates.
Due to this the load is failing.
I cannot use any other ID as the texts will not be loaded in that case.
Is there any work around to handle this?
I cannot load the transaction data either.
I looked at some posts and blogs in sdn, but nothing really helped.
Cube - 0RPM_c05 (Financial Planning cube in SAP PPM)
Version: BPC 7.5 NW SP5
Thanks,
Vasu

Thanks, everyone.
" you can write a transformation file and give a new name to those IDs who have values more than 20 characters."
Poonam - Could you explain more?
I tried using an additional dimension property to hold the 32-char ID, but I cannot access it in the transformation.
Is there a way to refer to the dimension property in the transformation file or in the UJD_ROUTINE BADI implementation?
Thanks,
Vasu