Error - During Export Grid Data to CSV
OS: Windows XP
Detail: Error message pop-up during Export Grid Data to CSV. Error message:
ORA-00972: identifier is too long
Vendor code 972.
There is a bug with all export formats if you are exporting without using aliases and you export a long text string (see More export issues/bugs).
This will fail:
select 'testing query execution on export' from dual
This will work:
select 'testing query execution on export' dummy from dual
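The failure comes from the derived column name: without an alias, the tool uses the expression text itself as the column identifier, which can exceed Oracle's 30-character identifier limit (hence ORA-00972). A minimal sketch of the same idea using Python's sqlite3 and csv modules (a hypothetical illustration, not the export tool's own code):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Without an alias, the driver reports the whole expression as the column name.
cur.execute("SELECT 'testing query execution on export'")
print(cur.description[0][0])  # the raw expression text, used as the CSV header

# With an alias, the header is short and safe for any export format.
cur.execute("SELECT 'testing query execution on export' AS dummy")
out = io.StringIO()
writer = csv.writer(out)
writer.writerow([d[0] for d in cur.description])  # header row comes from the alias
writer.writerows(cur.fetchall())
print(out.getvalue())
```

SQLite has no identifier-length limit, so only the header content differs here; in Oracle the unaliased name is what trips the 30-character check.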
Similar Messages
-
ERPI Error during Export data to GL for period 2012
Hi, experts
When I write back data from Planning to EBS, I encounter the following error during Export data to GL for period 2012. I have confirmed that the Planning data has been uploaded to ERPI.
Do you have any hints on this error for me? Thanks a lot.
Log:
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,CASE
WHEN (INSTR(UPPER(wld.TEMP_COLUMN_NAME),'AMOUNT',1) = 1) THEN
CAST(SUBSTR(wld.TEMP_COLUMN_NAME,7,LENGTH(wld.TEMP_COLUMN_NAME)) AS NUMERIC(15,0))
ELSE 0
END ENTITY_NAME_ORDER
FROM (
AIF_WRITEBACK_LOAD_DTLS wld
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.INTSYSTEMKEY = 'JC2PLN'
AND ppa.PERIODTARGET = wld.DIMENSION_NAME
AND ppa.YEARTARGET = 'FY12'
) LEFT OUTER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODTARGET = wld.DIMENSION_NAME
AND pp.YEARTARGET = 'FY12'
WHERE wld.LOADID = 176
AND wld.COLUMN_TYPE = 'DATA'
) query
GROUP BY PERIODTARGET
,YEARTARGET
,ENTITY_NAME_ORDER
) q
,TPOVPERIOD p
,AIF_GL_PERIODS_STG prd
WHERE p.PERIODKEY = q.PERIODKEY
AND NOT EXISTS (
SELECT 1
FROM AIF_PROCESS_DETAILS pd
WHERE pd.PROCESS_ID = 176
AND pd.ENTITY_TYPE = 'PROCESS_WB_EXP'
AND pd.ENTITY_ID = prd.YEAR
AND prd.SOURCE_SYSTEM_ID = 3
AND prd.SETID = '0'
AND prd.CALENDAR_ID = '10000'
AND prd.PERIOD_TYPE = 'Month'
AND prd.START_DATE > p.PRIORPERIODKEY
AND prd.START_DATE <= p.PERIODKEY
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
ORDER BY prd.YEAR
2012-10-12 11:05:10,078 INFO [AIF]: COMM Writeback Period Processing - Insert Periods into Process Details - END
2012-10-12 11:05:10,516 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2012-10-12 11:05:10,594 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'FAILED'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 0
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 176
AND ENTITY_TYPE = 'PROCESS_WB_EXP'
AND ENTITY_NAME = '2012'
2012-10-12 11:05:10,594 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2012-10-12 11:05:10,719 INFO [AIF]: ERPI Process End, Process ID: 176
I do not have any hints on this.
Would need to see the ODI Operator when the process is run. -
Error "Encountered an error while exporting the data. View the log file (DPR-10126)"
Hello,
I would like to have suggestions and help to solve a problem in IS 4.2.
I've created a view from two tables, joined together on the key from each one.
Then I've bound the rule to the view mentioned above and calculated a score for it.
The calculation is successful; however, when I try to export all failed data I get the following error:
"Encountered an error while exporting the data. View the log file (DPR-10126)"
When exporting failed data from calculations made directly against the data sources I do not encounter any problem, but when I try to export all data to a CSV file (not only the 500-result limit) I get this kind of error.
However, when I visualize the data from the created view, it is also possible to export it to a CSV file.
In summary, the error occurs only when exporting all failed data from the view.
Can anyone help with this issue?
Thanks
Best Regards,
Hello,
Your information is correct.
I've just confirmed it in the 4.2 SP3 Release Notes and they specifically refer to this issue.
A user who has 'View objects' rights for the project and a failed rule connection cannot export the failed data. This issue has been fixed in this release.
Thank you very much for your input.
Best Regards, -
Export batch data into CSV file using SQL SP
Hi,
I have created a WCF-Custom receive adapter to poll a SQL stored procedure (WITH XMLNAMESPACES (DEFAULT 'Namespace') and FOR XML PATH(''), TYPE). I get the result properly in batches while polling, but I get an error while converting it into CSV using a map.
Can anyone give me some idea how to export SQL data into a CSV file using an SP?
How are you doing this?
You would have got an XML representation for the batch received from SQL.
You should have a flat-file schema representing the CSV file which you want to send.
Map the received XML representation of the SQL data to the flat-file schema.
Have a custom pipeline with the flat-file assembler on the assemble stage of the send pipeline.
In the send port, use the map which converts the received XML from SQL to the flat-file schema, and use the above custom flat-file send pipeline.
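Outside BizTalk, the core of that mapping step (typed XML rows in, CSV records out) can be sketched in a few lines. This is a hypothetical standalone illustration, with made-up element names, not BizTalk or WCF code:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical XML batch, shaped like what a FOR XML PATH query might return.
xml_batch = """<rows>
  <row><id>1</id><name>alpha</name></row>
  <row><id>2</id><name>beta</name></row>
</rows>"""

def xml_to_csv(xml_text: str) -> str:
    """Flatten each <row> element into one CSV record, child tags as columns."""
    root = ET.fromstring(xml_text)
    rows = [{child.tag: child.text for child in row} for row in root]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(xml_to_csv(xml_batch))
```

The flat-file schema in BizTalk plays the role of the `fieldnames` list here: it fixes the column order and delimiters for the assembler.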
If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply. -
DRM Batch Client error during export
Hello,
I am experiencing an error when using the DRM Batch Client to execute export books. The same export process fails sometimes and runs to completion successfully at other times. When it fails, it does not always fail in the same place within the export process, and the error is as follows:
2/2/2011 8:31:40 PM - ERemotableException with message: "Server was unable to process request. ---> Error during Export. Export was unable to run. Error: Timeout expired" while running Export Book
2/2/2011 8:31:40 PM - => ERROR: Data Relationship Management Server returned error: "Server was unable to process request. ---> Error during Export. Export was unable to run. Error: Timeout expired."
2/2/2011 8:31:40 PM - *** MDMConnect stopping ***
Because this issue is sporadic, it seems like there could be a few possible causes:
1. Network issues
2. SQL connection/time-out issues -- These particular exports are writing to SQL tables versus flat files
3. Am wondering if there are any known issues with the MDMConnect? Or are there time-out settings we can adjust?
We currently have our in-house IT chasing down any possible network or SQL issues. Does anyone know of any issues with the batch client, or have ideas on any other possible causes? Any help is greatly appreciated. Thanks!
EJ
There is a timeout setting in DRM system preferences, and there are multiple session timeout settings in IIS that you may need to configure. Try the system preference first. For very large jobs, i.e. hundreds of thousands of nodes with hundreds of properties per node, you'll need to update the IIS settings as well.
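While the timeout root cause is being chased down, a common stopgap for sporadic batch failures like this is to wrap the batch-client invocation in a retry loop. A generic sketch; the command line and retry limits below are assumptions, not DRM specifics:

```python
import subprocess
import time

def run_with_retries(cmd, attempts=3, delay_seconds=60):
    """Run a command, retrying on a non-zero exit code with a fixed delay."""
    for attempt in range(1, attempts + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result
        print(f"attempt {attempt} failed (rc={result.returncode})")
        if attempt < attempts:
            time.sleep(delay_seconds)
    raise RuntimeError(f"command failed after {attempts} attempts: {cmd}")

# Hypothetical invocation of the DRM batch client:
# run_with_retries(["drm-batch-client.exe", "/config:export_book.xml"])
```

A retry only masks transient network or SQL timeouts; if the export fails deterministically at the same node count, the DRM/IIS timeout settings are the right fix.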
-
Materialized View with "error in exporting/importing data"
My system is a 10g R2 on AIX (dev). When I impdp a dump from another box, also 10g R2, the dump log file contains an error about the materialized view:
ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
Desc mv_emphora
Name Null? Type
C_RID ROWID
P_RID ROWID
T$CWOC NOT NULL CHAR(6)
T$EMNO NOT NULL CHAR(6)
T$NAMA NOT NULL CHAR(35)
T$EDTE NOT NULL DATE
T$PERI NUMBER
T$QUAN NUMBER
T$YEAR NUMBER
T$RGDT DATE
As I checked here and on Metalink, the information I found suggests this has little to do with the MV itself, so what was the cause? The log has 25074 lines in total, so I used grep from the OS to pull out the lines involving the MV. Here they are:
grep -n -i "TTPPPC235201" impBaanFull.log
5220:ORA-39153: Table "BAANDB"."TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
5845:ORA-39153: Table "BAANDB"."MLOG$_TTPPPC235201" exists and has been truncated. Data will be loaded but all dependent meta data will be skipped due to table_exists_action of truncate
8503:. . imported "BAANDB"."TTPPPC235201" 36.22 MB 107912 rows
8910:. . imported "BAANDB"."MLOG$_TTPPPC235201" 413.0 KB 6848 rows
grep -n -i "TTCCOM001201" impBaanFull.log
4018:ORA-39153: Table "BAANDB"."TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
5844:ORA-39153: Table "BAANDB"."MLOG$_TTCCOM001201" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
9129:. . imported "BAANDB"."MLOG$_TTCCOM001201" 9.718 KB 38 rows
9136:. . imported "BAANDB"."TTCCOM001201" 85.91 KB 239 rows
grep -n -i "MV_EMPHORA" impBaanFull.log
8469:ORA-39153: Table "BAANDB"."MV_EMPHORA" exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
8558:ORA-31693: Table data object "BAANDB"."MV_EMPHORA" failed to load/unload and is being skipped due to error:
8560:ORA-12081: update operation not allowed on table "BAANDB"."MV_EMPHORA"
25066:ORA-31684: Object type MATERIALIZED_VIEW:"BAANDB"."MV_EMPHORA" already exists
25072: BEGIN dbms_refresh.make('"BAANDB"."MV_EMPHORA"',list=>null,next_date=>null,interval=>null,implicit_destroy=>TRUE,lax=>
FALSE,job=>44,rollback_seg=>NULL,push_deferred_rpc=>TRUE,refresh_after_errors=>FALSE,purge_option => 1,parallelism => 0,heap_size => 0);
25073:dbms_refresh.add(name=>'"BAANDB"."MV_EMPHORA"',list=>'"BAANDB"."MV_EMPHORA"',siteid=>0,export_db=>'BAAN'); END;
The number in front of each line is the line number in the import log.
Here is my impdp syntax:
impdp user/pw SCHEMAS=baandb DIRECTORY=baanbk_data_pump DUMPFILE=impBaanAll.dmp LOGFILE=impBaanAll.log TABLE_EXISTS_ACTION=TRUNCATE
Yes, I can create the MV manually, and I have no problem refreshing it manually after the import. -
OMS error during install Grid control 10gR2 on Windows 2003
Hi,
I get this error during the install of Grid Control 10gR2 on a new Windows 2003 server.
"oracle.sysman.emcp.agent.AgentPlugIn"
What I see in forums is that I have to stop my listener and continue the installation.
But I still have the same error!
Can you please help?
Many thanks
There are some guidelines here:
Subject: OracleAS Install Error When Configuring AS Console Agent - "oracle.sysman.emcp.agent.AgentPlugIn has failed"
Doc ID: Note:386308.1
The solution stated is "Check the contents of the hosts file and ensure that every entry is correct. All hostname / IP addresses have to be valid."
When installing AS 9i R2 there was an issue with the names: they were not supposed to have a hyphen, so maybe something was inherited from those kernels.
There is another Metalink note that documents hosts-file issues:
Problem: AgentCA: Grid Control Release 2 install fails with java.lang.Exception: 6 during Agent Configuration Assistant
Doc ID: Note:358640.1
~ Madrid. -
Error during export of J2EE_CONFIGENTRY
Hi,
While doing a homogeneous system copy for a CRM 2007 system, it gives "Error during export of J2EE_CONFIGENTRY".
Below is the detailed error report.
15.04.09 14:01:29 com.sap.inst.jload.Jload dbExport
SEVERE: Error during export of J2EE_CONFIGENTRY
15.04.09 14:01:29 com.sap.inst.jload.Jload logStackTrace
SEVERE: java.lang.NullPointerException
at com.sap.sql.jdbc.mss.MssSQLExceptionAnalyzer.getCategory(MssSQLExceptionAnalyzer.java:112)
at com.sap.sql.services.core.CoreServices.isDuplicateKeyException(CoreServices.java:133)
at com.sap.sql.jdbc.direct.DirectPreparedStatement.processSQLException(DirectPreparedStatement.java:1189)
at com.sap.sql.jdbc.direct.DirectPreparedStatement.executeQuery(DirectPreparedStatement.java:285)
at com.sap.sql.jdbc.common.CommonPreparedStatement.executeQuery(CommonPreparedStatement.java:185)
at com.sap.inst.jload.db.DBTable.unload(DBTable.java:192)
at com.sap.inst.jload.Jload.dbExport(Jload.java:179)
at com.sap.inst.jload.Jload.executeJob(Jload.java:395)
at com.sap.inst.jload.Jload.main(Jload.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:324)
at com.sap.engine.offline.OfflineToolStart.main(OfflineToolStart.java:81)
15.04.09 14:01:29 com.sap.inst.jload.db.DBConnection disconnect
INFO: disconnected
Thanks & Regards
Raj Kiran
Hi,
Search for sap notes on "J2EE_CONFIGENTRY"
service.sap.com/notes
You can get a better solution there.
Thanks and Regards
Jaianandh V -
Error during export of xi-objects
Hi,
I get a strange error when trying to export objects from the repository and directory using "Transport Using File System". The error says "Error during export".
First we got the error "Internal error during pvc call: No space left on device", but this error disappeared without any action from us, and there is space left on the server.
Please help
Regards
Claes
Hi VJ,
It only says "Error during export".
/C -
[Error during Export] R3load exited with return code 11
Hi guys,
While performing a Unicode migration, during the export phase, the Migration Monitor returned the following error while exporting a package:
ERROR: 2014-06-01 10:57:44 com.sap.inst.migmon.LoadTask run
Unloading of 'REGUC' export package is interrupted with R3load error.
Process '/usr/sap/ECP/DVEBMGS00/exe/R3load -e REGUC.cmd -datacodepage 4102 -l REGUC.log -stop_on_error' exited with return code 11.
For more details see the 'REGUC.log' file.
Standard error output:
sapparam: sapargv(argc, argv) has not been called!
sapparam(1c): No Profile used.
sapparam: SAPSYSTEMNAME neither in Profile nor in Commandline
REGUC.log details:
(As you can see, it does not provide the reason for this error; all messages are informative, like all the other packages' logs)
/usr/sap/ECP/DVEBMGS00/exe/R3load: START OF LOG: 20140601105052
/usr/sap/ECP/DVEBMGS00/exe/R3load: sccsid @(#) $Id: //bas/740_REL/src/R3ld/R3load/R3ldmain.c#4 $ SAP
/usr/sap/ECP/DVEBMGS00/exe/R3load: version R7.40/V1.8
Compiled Jul 23 2013 21:30:53
/usr/sap/ECP/DVEBMGS00/exe/R3load -e REGUC.cmd -datacodepage 4102 -l REGUC.log -stop_on_error
(DB) INFO: connected to DB
(DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): US7ASCII
(EXP) INFO: check NameTab widths: Ok.
(DB) INFO: Export without hintfile
(RSCP) INFO: UMGCOMCHAR read check, OK.
(RSCP) INFO: environment variable "I18N_POOL_WIDTH" is not set. Checks are active.
(RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
(RSCP) INFO: UMGSETTINGS nametab creation: ok.
(RSCP) INFO: Global fallback code page = 1160
(RSCP) INFO: Common character set is not 7-bit-ASCII
(RSCP) INFO: Collision resolution method is 'fine'
(RSCP) INFO: R3trans code pages = Normal
(RSCP) INFO: EXPORT TO ... code pages = Normal
(RSCP) INFO: Check for invalid language keys: activated
(RSCP) INFO: I18N_NAMETAB_NORM_ALLOW = 999999999
(RSCP) INFO: I18N_NAMETAB_NORM_LOG = 1000000002
(RSCP) INFO: I18N_NAMETAB_ALT_ALLOW = 10000
(RSCP) INFO: I18N_NAMETAB_ALT_LOG = 10003
(RSCP) INFO: I18N_NAMETAB_OLD_ALLOW = 0
(RSCP) INFO: I18N_NAMETAB_OLD_LOG = 500
(RSCP) INFO: init SUMG interface (3#$Revision: #1 $); package name: "REGUC".
(RSCP) INFO: "REGUC001.xml" created.
(GSI) INFO: dbname = "ECP20030615080638
(GSI) INFO: vname = "ORACLE "
(GSI) INFO: hostname = "ecp "
(GSI) INFO: sysname = "SunOS"
(GSI) INFO: nodename = "ecp"
(GSI) INFO: release = "5.10"
(GSI) INFO: version = "Generic_142900-13"
(GSI) INFO: machine = "sun4u"
I tried to find out what R3load return code 11 means, but I couldn't find further information on it.
Any hint on what is causing this error?
(Please disregard any filesystem issues, since all filesystems have available space.)
Best!
After using the latest R3load patch, it provides further information in the log file:
-------------------- Start of patch information ------------------------
patchinfo (patches.h): (0.070) R3ldctl: fix CDS views creation in upgrade export mode (note 2017805)
DBSL patchinfo (patches.h): (0.022) Smaller corrections in release 7.40 (th, dp, vmc, dpmon) (note 1821404)
--------------------- End of patch information -------------------------
process id 27326
(DB) INFO: connected to DB
(DB) INFO: DbSlControl(DBSL_CMD_NLS_CHARACTERSET_GET): US7ASCII
(EXP) INFO: check NameTab widths: Ok.
(DB) INFO: Export without hintfile
(RSCP) INFO: UMGCOMCHAR read check, OK.
(RSCP) INFO: environment variable "I18N_POOL_WIDTH" is not set. Checks are active.
(RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
(RSCP) INFO: UMGSETTINGS nametab creation: ok.
(RSCP) INFO: Global fallback code page = 1160
(RSCP) INFO: Common character set is not 7-bit-ASCII
(RSCP) INFO: Collision resolution method is 'fine'
(RSCP) INFO: R3trans code pages = Normal
(RSCP) INFO: EXPORT TO ... code pages = Normal
(RSCP) INFO: Check for invalid language keys: activated
(RSCP) INFO: I18N_NAMETAB_NORM_ALLOW = 999999999
(RSCP) INFO: I18N_NAMETAB_NORM_LOG = 1000000002
(RSCP) INFO: I18N_NAMETAB_ALT_ALLOW = 10000
(RSCP) INFO: I18N_NAMETAB_ALT_LOG = 10003
(RSCP) INFO: I18N_NAMETAB_OLD_ALLOW = 0
(RSCP) INFO: I18N_NAMETAB_OLD_LOG = 500
(RSCP) INFO: init SUMG interface (3#$Revision: #1 $); package name: "REGUC".
(RSCP) INFO: "REGUC003.xml" created.
(GSI) INFO: dbname = "ECP20030615080638
(GSI) INFO: vname = "ORACLE "
(GSI) INFO: hostname = "ecp "
(GSI) INFO: sysname = "SunOS"
(GSI) INFO: nodename = "ecp"
(GSI) INFO: release = "5.10"
(GSI) INFO: version = "Generic_142900-13"
(GSI) INFO: machine = "sun4u"
(GSI) INFO: instno = "6120023288"
(RTF) ########## WARNING ###########
Without ORDER BY PRIMARY KEY the exported data may be unusable for some databases
(RDI) WARNING: /respaldo/ECP_Export/ABAP/DATA/REGUC.STR: cannot find version token "ver:" at line 1
(RDI) WARNING: /respaldo/ECP_Export/ABAP/DATA/REGUC.STR: unknown file format, assuming version 2
(EXP) INFO: table REGUC will be exported with sorting (task modifier)
------------------ C-STACK ----------------------
[0] t_splay ( 0x1020df850, 0xb3, 0x4386, 0x100a02078, 0x0, 0x1020df870 ), at 0xffffffff7b962994
[1] t_delete ( 0x1020df850, 0x4348, 0x1d9870, 0x1000d0408, 0xffffffff7bb3c000, 0x0 ), at 0xffffffff7b9627e4
[2] realfree ( 0x101ee7590, 0x1f82b1, 0x1d9cd0, 0x1f82b0, 0xffffffff7bb3c000, 0x101ee7590 ), at 0xffffffff7b9623b0
[3] _malloc_unlocked ( 0x7d40, 0x200000, 0x101edf840, 0x0, 0x101ee7580, 0x0 ), at 0xffffffff7b961eb4
[4] malloc ( 0x7d39, 0x2388, 0x1da428, 0x10038fe84, 0xffffffff7bb3c000, 0x2000 ), at 0xffffffff7b961c28
[5] rstg_get ( 0x2, 0x0, 0x2, 0x7d10, 0x0, 0x10077d8c0 ), at 0x10038ff4c
[6] c3_uc_new_cache ( 0x100a02078, 0x284db0, 0x284c00, 0x10077d8c0, 0x100a02108, 0xb3 ), at 0x1000d1ce8
[7] c3_uc_seek_cache ( 0xb3, 0xff79, 0xffffffff7ffcae3e, 0x100a02078, 0x10189fa48, 0x10077d8c0 ), at 0x1000d188c
[8] c3_uc_new_tabcache ( 0x0, 0xb3, 0x4386, 0x100a02078, 0xb3, 0x10077d8c0 ), at 0x1000d13fc
[9] c3_uc_insert_record_in_cache ( 0xffffffff7ffcb04e, 0x10189fa48, 0x100a02078, 0x100a024c8, 0xb3, 0x1058c8f84 ), at 0x1000d0408
[10] c3_uc_insert ( 0x0, 0x100a02078, 0x0, 0x0, 0x10077d8c0, 0xffffffff7ffcb04e ), at 0x1000cf944
[11] c3_uc_convert_table_entry ( 0x1015c61f8, 0x0, 0x100a02078, 0x0, 0x10189f408, 0x31a000 ), at 0x1000ce314
[12] c3_uc_convert_logic_table ( 0x1015c61f8, 0x100a02078, 0x0, 0x4a, 0x100638dc0, 0x10077d8c0 ), at 0x1000cdb18
[13] c3_uc_convert_cluster_data ( 0x100a02078, 0xffffffff7ffcbc50, 0x1015c6428, 0x1015c61f8, 0x10077d8c0, 0x0 ), at 0x1000cd10c
[14] c3_uc_convert_cluster_item ( 0xffffffff7ffcbc78, 0xffffffff7ffcbc50, 0x100954770, 0x10386f6e0, 0xffffffff7ffcbb8c, 0x100a02812 ), at 0x1000be1d4
[15] process_task ( 0xffffffff7ffcbc78, 0xffffffff7ffcbc50, 0x100954770, 0x100a02810, 0x1006332e0, 0x10077d8c0 ), at 0x1000b2f58
[16] CnvCluster ( 0x1014642d0, 0x100954670, 0x100aa9610, 0x2849b8, 0x284800, 0x6cb978 ), at 0x1000b1f8c
[17] cnv_clust2 ( 0x100aa9580, 0xffffffff7ffcc258, 0x100954770, 0xfc0, 0x107d00, 0x100954670 ), at 0x100062714
[18] cnv_cp ( 0x100aa9580, 0xffffffff7ffcc258, 0x100aa95e0, 0x0, 0x10077d8c0, 0x1d6c00 ), at 0x100064644
[19] process_buffer ( 0x100aa9580, 0xffffffff7ffcc258, 0xfff00, 0x100a97bd0, 0x10, 0x100aa95b0 ), at 0x10007162c
[20] execute_table_unload ( 0xffffffff7ffed610, 0xffffffff7ffdc4b0, 0x0, 0x100aa9580, 0xffffffff7ffcc258, 0x10077d8c0 ), at 0x10006c488
[21] export_table_task ( 0xffffffff7ffed610, 0x0, 0xffffffff7ffdc368, 0xffffffff7ffcc330, 0xffffffff7ffeedb8, 0xffffffff7ffeec60 ), at 0x10006c934
[22] db_unload ( 0x100a97b0d, 0xffffffff7ffed610, 0x0, 0x10077d8c0, 0xffffffff7ffeec60, 0xffffffff7bb47540 ), at 0x10006d148
[23] main_r3ldmain ( 0x72ec00, 0xffffffff7ffff448, 0xffffffff7ffff228, 0xffffffff7bb47540, 0x1005fc290, 0x0 ), at 0x1000514e0 -
Error during export & import of rules in RAR 5.2
Hi all,
We followed the steps for exporting & importing the rules as per the Config guide, but we are receiving the below error during the import of rules. Can anybody please shed some light on why this error message is appearing?
"Error in Table Data ==> VIRSA_CC_CRPROF
SQL==>Insert into VIRSA_CC_CRPROF(VSYSKEY,PROFILE,RULESETID,RISKLEVEL,STATUS)Values(??????)
Record==>D VIRSA_CC_CRPROF null"
We also ensured that the downloaded file is not truncated and saved it in a separate folder. Does the file need to be saved as ANSI or Unicode text? We saved it as ANSI.
Also, the background job is not getting scheduled automatically during the import of rules. Is that due to the above error?
Thanks and Best Regards,
Srihari.K
Hello Sri,
The background job for generating the rules won't be scheduled before you upload the file successfully.
The most obvious reason for this error message is that you have a corrupted line in your file, in table VIRSA_CC_CRPROF. Make sure all lines from table VIRSA_CC_CRPROF have all predefined fields (VSYSKEY, PROFILE, RULESETID, RISKLEVEL, STATUS).
If you keep hitting this problem, just delete this table; after the upload you'll add the critical profiles manually. I bet you don't have many of them.
Also, from my experience, I always choose to save the downloaded file in Unicode UTF-8 format.
Once the file is saved in another format, there's no use for it; download it again and save it directly as Unicode.
Make sure you don't have empty fields: even in fields where you have no values you must leave a space, otherwise you'll keep hitting the same issue.
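The two checks above (every line carries all five fields; no zero-length fields) can be automated before the upload. A hypothetical sketch, assuming a tab-delimited rules file; the field names come from the error message, everything else is illustration:

```python
# Fields every VIRSA_CC_CRPROF record must carry, per the error message above.
FIELDS = ("VSYSKEY", "PROFILE", "RULESETID", "RISKLEVEL", "STATUS")

def find_bad_lines(lines):
    """Return (line_number, reason) pairs for records that would break the import."""
    problems = []
    for number, line in enumerate(lines, start=1):
        values = line.rstrip("\n").split("\t")
        if len(values) != len(FIELDS):
            problems.append((number, f"expected {len(FIELDS)} fields, got {len(values)}"))
        # A single space is acceptable (per the advice above); a zero-length field is not.
        elif any(value == "" for value in values):
            problems.append((number, "empty field"))
    return problems

# Example: the second record is missing its STATUS field.
sample = ["SYS1\tPROF1\tGLOBAL\tHigh\tEnabled", "SYS2\tPROF2\tGLOBAL\tHigh"]
print(find_bad_lines(sample))  # [(2, 'expected 5 fields, got 4')]
```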
Regards,
Iliya -
Essbase Version 11 Error During Export
Hi,
I have just installed Hyperion Essbase version 11 on a Windows 2003 server; it's a fresh install on a new machine. I am getting the following error during the nightly backup.
After "Calc All" we export data to the following location: Hyperion\APP\DB. The same MaxL script works fine on version 7 of Essbase. If I export data manually it works fine; the only issue is during the nightly backup (automated). Data size is 1.4 GB. Please let me know if you need more information.
Error.. *ERROR - 1005000 - Ascii Backup: Failed to open [\PLAN\TEST\TEST_Export_AllData_1.txt]..*
Maxl Script:
MAXL> export database "PLAN"."TEST" all data to data_file
2> "'\\PLAN\\TEST\\TEST_Export_AllData_1.txt'",
3> "'\\PLAN\\TEST\\TEST_Export_AllData_2.txt'";
Error:
OK/INFO - 1019020 - Writing Free Space Information For Database [TEST].
OK/INFO - 1005029 - Parallel export enabled: Number of export threads [2].
*ERROR - 1005000 - Ascii Backup: Failed to open [\PLAN\TEST\TEST_Export_AllData_1.txt]..*
MaxL Shell completed with error
Thank you,
T.Khan
I tried with another path and it works. I really appreciate all your help. I used the following MaxL:
export database "Plan"."Test" all data to data_file
C:\Hyperion\products\Essbase\EssbaseServer\app\PLAN\TEST\Export_AllDtata_1.txt,
C:\Hyperion\products\Essbase\EssbaseServer\app\PLAN\TEST\Export_AllDtata_2.txt;
What changes do I need to make to my MaxL? Do I need to make any changes to the variables? Here is my current MaxL:
/* Export Database */
/* Environment Variables: (set by calling script) */
/* $ess_UserName - Essbase username */
/* $ess_Password - Essbase password */
/* $ess_Server - Essbase server */
/* Local variables: (set by MaxL script) */
/* $APP - Essbase application ($1 positional parameter) */
/* $DB - Essbase database ($2 positional parameter) */
/* Any error returned from a MaxL command causes the */
/* logic to branch to an error exit at end of the script */
/* set local variables to values of positional parameters */
set APP=$1;
set DB=$2;
/* Login to Essbase */
login $ess_UserName $ess_Password on $ess_Server;
/* Export database in parallel to the database directory - all data */
export database "$APP"."$DB" all data to data_file
"'\\$APP\\$DB\\$(DB)_Export_AllData_1.txt'",
"'\\$APP\\$DB\\$(DB)_Export_AllData_2.txt'";
iferror 'ERROREXIT';
/* Normal Exit */
Thank You
T.Khan -
Getting error in Exporting the data from legacy system to Tables
Hi,
I am trying to export the data, but am getting the below error.
Unknown system exception - java.security.PrivilegedActionException: javax.security.auth.login.LoginException: java.net.ConnectException: t3://abc.com:7001: Bootstrap to abc.jdadelivers.com/0:0:0:0:0:0:0:1:7001 failed. It is likely that the remote side declared peer gone on this JVM.
No records are exported.
Include column headers: Yes
Format: Separated (,), enclosing character (")
I tried with a different user but got the same error. Please have a look and let us know the root cause of the issue.
Many thanks in advance.
Vas.
Export? What kind of export is this? I see a WebLogic port... Can you provide more information?
-
Error while loading data from CSV with a CTL file..?
Hi TOM,
When I try to load data from a CSV file into this table, my CTL file content is:
load data
into table XXXX append
Y_aca position char (3),
x_date position date 'yyyy/mm/dd'
NULLIF (x_date = ' '),
X_aca position (* + 3) char (6)
"case when :Y_aca = 'ABCDDD' and :XM_dt is null then
decode(:X_aca,'AB','BA','CD',
'DC','EF','FE','GH','HG',:X_aca)
else :X_aca
end as X_aca",
Z_cdd position char (2),
XM_dt position date 'yyyy/mm/dd'
NULLIF XM_dt = ' ',
When I try the above CTL file, I get the following error:
SQL*Loader-281: Warning: ROWS parameter ignored in parallel mode.
SQL*Loader-951: Error calling once/load initialization
ORA-02373: Error parsing insert statement for table "XYZ"."XXXX".
ORA-00917: missing comma
Possible Solutions
Make sure that the data source is valid.
Is a member from each dimension specified correctly in the data source or rules file?
Is the numeric data field at the end of the record? If not, move the numeric data field in the data source or move the numeric data field in the rules file.
Are all members that might contain numbers (such as "100") enclosed in quotation marks in the data source?
If you are using a header, is the header set up correctly? Remember that you can add missing dimension names to the header.
Does the data source contain extra spaces or tabs?
Has the updated outline been saved? -
Exporting Grid data to Flex...
How can I export my grid data to Excel on right-click of the grid?
You can do it easily in AIR2:
http://coenraets.org/blog/2009/11/open-in-excel-another-air-2-mini-sample/
Thanks,
Oleg.