Error while binding data into Node which is Created Dynamically.
Hi All
We have a scenario where we need to create a context node and its attributes dynamically at runtime. When I create the node from a structure using the method <b>create_nodeinfo_from_struct()</b>, the attributes under the created node are taken automatically from the supplied structure SPFLI:
cl_wd_dynamic_tool=>create_nodeinfo_from_struct(
  parent_info    = rootnode_info
  node_name      = 'DYN'
  structure_name = 'SPFLI'
  is_multiple    = abap_true ).
With the method above everything works fine and I can bind values into the node.
But our requirement is to create the node and its attributes without any predefined structure, adding the attributes manually one by one.
For that I have used the methods rootnode_info->add_new_child_node( ) and
dyn_node->add_attribute( exporting attribute_info = ls_attribute ).
But I am getting a short dump: <b>Line types of an internal table and a work area not compatible.</b>
Please give me a solution.
I used the following code:
data: rootnode_info type ref to if_wd_context_node_info,
      dyn_node      type ref to if_wd_context_node,
      node_nsp      type ref to if_wd_context_node_info,
      ls_attribute  type wdr_context_attribute_info.
rootnode_info = wd_context->get_node_info( ).
call method rootnode_info->add_new_child_node
  exporting
    name                         = 'DYN'
    is_mandatory                 = abap_false
    is_mandatory_selection       = abap_false
    is_multiple                  = abap_true
    is_multiple_selection        = abap_false
    is_singleton                 = abap_true
    is_initialize_lead_selection = abap_true
    is_static                    = abap_false
  receiving
    child_node_info              = node_nsp.
ls_attribute-name = 'CARRID'.
ls_attribute-type_name = 'S_CARR_ID'.
ls_attribute-value_help_mode = '0'.
node_nsp->add_attribute( exporting attribute_info = ls_attribute ).
ls_attribute-name = 'CONNID'.
ls_attribute-type_name = 'S_CONN_ID'.
ls_attribute-value_help_mode = '0'.
node_nsp->add_attribute( exporting attribute_info = ls_attribute ).
DATA: BEGIN OF str,
        carrid TYPE s_carr_id,
        connid TYPE s_conn_id,
      END OF str.
DATA it_str LIKE TABLE OF str.
SELECT carrid connid FROM spfli
  INTO CORRESPONDING FIELDS OF TABLE it_str.
dyn_node = wd_context->get_child_node( name = 'DYN' ).
dyn_node->bind_table( it_str ).
data: l_ref_cmp_usage type ref to if_wd_component_usage.
l_ref_cmp_usage = wd_this->wd_cpuse_alv_usage( ).
if l_ref_cmp_usage->has_active_component( ) is initial.
l_ref_cmp_usage->create_component( ).
endif.
data: l_ref_interfacecontroller type ref to iwci_salv_wd_table .
l_ref_interfacecontroller = wd_this->wd_cpifc_alv_usage( ).
l_ref_interfacecontroller->set_data(
  r_node_data = dyn_node ). " ref to if_wd_context_node
Thanks in advance..
Nojish
Message was edited by:
nojish p
Hi Nojish,
I guess the error happens when you try to read the static attributes of an element of your dynamic node. That does not work with your locally defined type str, because get_static_attributes only returns the static attributes, not the dynamic ones. Static attributes are the ones defined at design time; that is why the structure does not match.
But you can use the following code to read your dynamic attributes, for example getting the lead-selection element and the attribute CARRID:
dyn_node = wd_context->get_child_node( name = 'DYN' ).
lr_el = dyn_node->get_element( ).
lr_el->get_attribute(
  EXPORTING
    name  = 'CARRID'
  IMPORTING
    value = lv_carrid ).
Hope this helps.
PS: Or do you get the error in the code above? Your code works here. If so, please provide the short dump.
Cheers,
Sascha
Message was edited by:
Sascha Dingeldey
Similar Messages
-
Error while Inserting data into flow table
Hi All,
I am very new to ODI and am facing a lot of problems in my first interface. So I have many questions here; please forgive me if they irritate you.
========================
I am developing a simple Project to load a data from an input source file (csv) file into a staging table.
My plan is to achieve this in 3 interfaces:
1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
Question-1 : Is this approach correct?
========================
I don't have any key columns in the staging table (stg_1). When I tried to execute Flow Control for it I got the error:
Flow Control not possible if no Key is declared in your Target Datastore
Based on one of the responses in this forum ("FLOW control requires a KEY in the target table"), I introduced a column called "Record_ID" and made it the primary key of my staging table (stg_1), which resolved the problem.
Question-2 : Is a key column compulsory in the target table? I am coming from BO Data Integrator, where there is no such requirement, so I am a little confused.
========================
Next, I defined a project-level sequence and mapped the newly introduced key column Record_Id (primary key) to it. Now I get another error: "CKM not selected".
For this, I inserted the "Insert Check (CKM)" knowledge module into my project, which resolved the "CKM not selected" error.
Question-3 : When is this CKM knowledge module required?
========================
After this, the flow/interface fails while loading data into the intermediate flow table (I$) that ODI creates:
1 - Loading - SS_0 - Drop work table
2 - Loading - SS_0 - Create work table
3 - Loading - SS_0 - Load data
5 - Integration - FTE Actual data to Staging table - Drop flow table
6 - Integration - FTE Actual data to Staging table - Create flow table I$
7 - Integration - FTE Actual data to Staging table - Delete target table
8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
The error is at Step-8 above. When I opened the "Execution" tab for this step I found the message "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
Question-4 : What/why is this error? Did I make a mistake while creating the sequence?
Everyone is new and starts somewhere. And the community is here to help you.
1.) What is the idea of moving data from stg_1 and then to stg_2? Do you really need them for any purpose other than moving data from the source file to the target DB?
Otherwise, it is simpler to move the data directly from SourceFile -> Target Table.
2.) Does your Target table have a Key ?
3.) A CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data flowing from the source file to the target table using the CKM. All records that do not satisfy a constraint are added to E$ (the error table) and are not added to the target table.
4.) Try to avoid ODI sequences. They are slow and aren't scalable. Use a database sequence wherever possible, and reference the DB sequence in the target mapping as
<%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
where MY_DB_Sequence_Row is the oracle sequence in the target schema.
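The flow-control behaviour described in point 3 can be pictured in plain Python (purely an illustration of the mechanics the CKM automates; the rule and column names are made up, this is not an ODI API):

```python
def flow_control(rows, constraints):
    """Split incoming rows into those that pass every constraint
    (bound for the target table) and those that fail at least one
    (bound for the E$ error table)."""
    good, errors = [], []
    for row in rows:
        failed = [name for name, check in constraints.items() if not check(row)]
        if failed:
            # Failing rows are annotated with the violated rules, like E$
            errors.append({**row, "err$_constraints": failed})
        else:
            good.append(row)
    return good, errors

# Hypothetical business rule: amount must be positive.
constraints = {"AMOUNT_POSITIVE": lambda r: r["amount"] > 0}
rows = [{"id": 1, "amount": 10}, {"id": 2, "amount": -5}]
good, errors = flow_control(rows, constraints)
```

Here the second row would be routed to the error set rather than loaded, which is exactly why Flow Control needs a key: to identify which target rows the error records correspond to.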
HTH -
Error while loading data into External table from the flat files
HI ,
We have a data load in our project that feeds Oracle external tables with data from flat files (.bcp files) on Unix.
While loading the data, we encounter the following error:
Error occured (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works).... please check those links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
HTH,
Thierry -
Getting error while loading Data into ASO cube by flat file.
Hi All,
I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
Does anyone have a solution?
Regards,
VM
Are you using ODI to load the data or MaxL? If you are using an ODI interface, are you also using a load rule? Which version of Essbase and ODI are you using?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while loading data into clob data type.
Hi,
I have created an interface to load data from an Oracle table into another Oracle table. The target table has an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I am using ODI 10.1.3.6.0.
java.lang.NumberFormatException: For input string: "4294967295"
at java.lang.NumberFormatException.forInputString(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
Let me know if anyone has come across and resolved this kind of issue.
Thanks much,
Nishit Gajjar
Mr. Gajjar,
You didn't mention which KMs you are using.
have a read of
Re: Facing issues while using BLOB
and
Load BLOB column in Oracle to Image column in MS SQL Server
Try again.
And can you please mark the Correct/Helpful points to the answers too.
Edited by: actdi on Jan 10, 2012 10:45 AM -
Error while inserting data into a table.
Hi All,
I created a table. While inserting data into the table I am getting an error: "Create data Processing Function Module". Can anyone help me with this?
Thanks in advance
anirudh
Hi Anirudh,
It seems there is already an entry in the table with the same primary key.
The INSERT statement gives a short dump if you try to insert data with the same key.
Why don't you use the MODIFY statement to achieve the same result?
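The INSERT-vs-MODIFY distinction above is essentially insert-vs-upsert. Sketched generically in Python (a dict keyed by the primary key stands in for the database table; the key and field names are made up):

```python
table = {}  # primary key -> row

def insert(key, row):
    """Like ABAP INSERT: fails (short dump) on a duplicate key."""
    if key in table:
        raise KeyError(f"duplicate primary key {key!r}")
    table[key] = row

def modify(key, row):
    """Like ABAP MODIFY: inserts a new row, or overwrites the existing one."""
    table[key] = row

insert("001", {"name": "first"})
modify("001", {"name": "updated"})  # no error: the existing row is replaced
```

A second insert("001", ...) at this point would raise the duplicate-key error, which is the analogue of the short dump described above.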
Reward points if this Helps.
Manish -
Error while loading data into Essbase using ODI
Hi ,
I'm getting the following error while loading measures into Essbase using ODI. I used the same log and error file and file path for all my dimensions and it worked well, but I am not sure why it is not working for measures. Need help.
File "<string>", line 79, in ?
com.hyperion.odi.common.ODIHAppException: c:/temp/Log1.log (No such file or directory)
Thanks
Venu
Are you definitely running it against an agent where that path exists?
Have you tried using a different location and filename? Have you restarted the agent to make sure there is not a lock on the file?
Cheers
John
http://john-goodwin.blogspot.com/ -
Error while importing data into Oracle 11gr2 with arcsde 9.3.1
I am getting an error while importing data into Oracle 11g R2. We are using ArcSDE 9.3.1.
It seems to be a problem with spatial index creation.
Kindly help.
IMP-00017: following statement failed with ORACLE error 29855:
"CREATE INDEX "A3032_IX1" ON "DGN_POLYLINE_2D" ("SHAPE" ) INDEXTYPE IS "MDS"
"YS"."SPATIAL_INDEX""
IMP-00003: ORACLE error 29855 encountered
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfergm]
ORA-13200: internal error [ROWID:AAAT5pAA9AACIy5AAQ] in spatial indexing.
ORA-13206: internal error [] while creating the spatial index
ORA-13033: Invalid data in the SDO_ELEM_INFO_ARRAY in SDO_GEOMETRY object
ORA-06512: at "MDSYS
Guys,
I am also getting the same error, and my issue is that I am not even able to tell which indexes the error is for. There is no index name before the error.
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/DOMAIN_INDEX/INDEX
ORA-39083: Object type INDEX failed to create with error:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfer]
ORA-29400: data cartridge error
ORA-12801: error signaled in parallel query server P000
ORA-13249: Error in spatial index: [mdpridxtxfergm]
ORA-13200: internal error [ROWID:AA
ORA-39083: Object type INDEX failed to create with error:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfer]
ORA-29400: data cartridge error
ORA-12801: error signaled in parallel query server P002
ORA-13249: Error in spatial index: [mdpridxtxfergm]
ORA-13200: internal error [ROWID:AA
ORA-39083: Object type INDEX failed to create with error:
ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
ORA-13249: internal error in Spatial index: [mdidxrbd]
ORA-13249: Error in Spatial index: index build failed
ORA-13249: Error in spatial index: [mdrcrtxfergm]
ORA-13249: Error in spatial index: [mdpridxtxfer]
ORA-29400: data cartridge error
stack cnt....
How can I find for which indexes it is failing?
Thank you,
Myra -
Error while loading data into BW (BW as Target) using Data Services
Hello,
I'm trying to extract data from SQL Server 2012 and load into BW 7.3 using Data Services. Data Services shows that the job is finished successfully. But, when I go into BW, I'm seeing the below / attached error.
Error while accessing repository Violation of PRIMARY KEY constraint 'PK__AL_BW_RE_
Please let me know what this means and how to fix this. Not sure if I gave the sufficient information. Please let me know if you need any other information.
Thanks
Pradeep
Hi Pradeep,
Regarding your query, please refer to the SCN thread below for the same issue:
SCN Thread:
FIM10 to BW 73- Violation of PRIMARY KEY -table AL_BW_REQUEST
Error in loading data from BOFC to BW using FIM 10.0
Thanks,
Daya -
Error while loading data into 0pur_c03 using 2lis_02_scl
Hi,
While trying to load data into 0pur_c03 using DataSource 2lis_02_scl from R/3, the error message "The material 100 does not exist or not active" is displayed. I have loaded master data for 0MATERIAL and performed the change run as well, but the problem is not resolved. Please suggest how to resolve this issue.
Regards,
Hari Prasad B.
I had the same problem in my system, and I found that material XXXXX had movements (sales documents and more), but at a certain point the material was deleted in R/3. It is not loaded into the 0MATERIAL InfoObject because it is marked for deletion.
If you have this error, you must activate the load request manually; your InfoCube can then hold the data without any problem.
Regards -
Hello All,
I have a job to load data from SQL Server to SAP BW. I have followed the steps from the SAP wiki to do this.
1. I created an RFC connection between the two servers (SAP and the BODS job server).
When I schedule and start the job immediately from SAP BW, I get this error and it aborts the RFC connection:
"Error while executing the following command: -sJO return code:238"
Error message during processing in BI
Diagnosis
An error occurred in BI while processing the data. The error is documented in an error message.
System Response
A caller 01, 02 or equal to or greater than 20 contains an error message.
Further analysis:
The error message(s) was (were) sent by:
Scheduler
Any help would be appreciated......
Thanks
Praveen...
Hi Praveen,
I want to know which version of BODS you are using for your ETL job development.
If it is BODS 12.2.2.2, you will get this type of problem frequently, as in version 12.2.2.2 only two connections can be created for the RFC between BW and BODS.
So I suggest that if you are using BODS 12.2.2.2, you upgrade to BODS 12.2.0.0 with Service Pack 3 and Fix Pack 3.
As in BODS 12.2.3.3 we have the option of having ten connections in parallel at a time, which helps in resolving these issues.
Please let me know your BODS version, and if you have upgraded your BODS to SP3 with FP3, whether your problem is resolved or not.
All the best..!!
Thanks ,
Shandilya Saurav -
Error while loading Data into Essbase using ODI
Hi,
I am very new to ODI. I have installed ODI and am working in the demo environment only; I haven't done any configuration. I am using the Essbase technology that comes by default.
I have created a sample outline in Essbase and a text file to load data into Essbase using ODI.
Following is my text file:
Time Market Product Scenario Measures Data
Jan USA Pepsi Actual Sales 222
I am getting an error. I checked in Operator; it occurs at step 6, i.e. "Integration SampleLoad data into essbase".
Here is the description.
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select C3_C1 ""Time"",C5_C2 ""Market"",C2_C3 ""product"",C6_C4 ""Scenario"",C1_C5 ""Measures"",C4_C6 ""Data"" from "C$_0Demo_Demo_genData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
stmt.setFetchSize(srcFetchSize)
rs = stmt.executeQuery(sql)
#load the data
stats = pWriter.loadData(rs)
#close the database result set, connection
rs.close()
stmt.close()
Please help me to proceed further...
Hi John,
Here is the error message in the execution tab:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 20, in ?
java.sql.SQLException: Unexpected token: TIME in statement [select C3_C1 ""Time]
at org.hsqldb.jdbc.jdbcUtil.sqlException(jdbcUtil.java:67)
at org.hsqldb.jdbc.jdbcStatement.fetchResult(jdbcStatement.java:1598)
at org.hsqldb.jdbc.jdbcStatement.executeQuery(jdbcStatement.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
at org.python.pycode._pyx4.f$0(<string>:20)
at org.python.pycode._pyx4.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java)
at org.python.core.PyCode.call(PyCode.java)
at org.python.core.Py.runCode(Py.java)
at org.python.core.Py.exec(Py.java)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
java.sql.SQLException: java.sql.SQLException: Unexpected token: TIME in statement [select C3_C1 ""Time]
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
Error while exporting data into Excel Sheet
Hi All,
I have created a VO which is based on Query(Not based on EO) and the Query is as follows:
select f.user_name ,
f.description ,
a.currency_code,
a.amount_to ,
a.amount_from
from seacds.ar_approval_user_limits_nv a , seacds.fnd_user_nv f
where f.user_id = a.user_id
and a.document_type = 'CM'
order by 2;
Based on this VO I created a search page that searches and returns data from the table; the standard Export button then exports the data into an Excel sheet.
I am searching the data based on the above 5 attributes. If I click the GO button without entering anything in the messageTextInput fields, all the data is returned into the table region. If I then click the Export button, the data is exported into the Excel sheet correctly.
But if I search by entering a value in any of the messageTextInput fields, data is returned, but clicking the Export button then throws the following error:
Exception Details.
oracle.apps.fnd.framework.OAException: oracle.jbo.SQLStmtException: JBO-27122: SQL error during statement preparation. Statement: SELECT * FROM (select f.user_name ,
f.description ,
a.currency_code,
a.amount_to ,
a.amount_from
from seacds.ar_approval_user_limits_nv a , seacds.fnd_user_nv f
where f.user_id = a.user_id
and a.document_type = 'CM'
order by 2) QRSLT WHERE (( UPPER(CURRENCY_CODE) like UPPER(:1) AND (CURRENCY_CODE like :2 OR CURRENCY_CODE like :3 OR CURRENCY_CODE like :4 OR CURRENCY_CODE like :5))) ORDER BY DESCRIPTION asc
at oracle.apps.fnd.framework.OAException.wrapperException(Unknown Source)
at oracle.apps.fnd.framework.webui.OAPageErrorHandler.prepareException(Unknown Source)
at oracle.apps.fnd.framework.webui.OAPageErrorHandler.processErrors(Unknown Source)
at oracle.apps.fnd.framework.webui.OAPageBean.processFormRequest(Unknown Source)
at oracle.apps.fnd.framework.webui.OAPageBean.preparePage(Unknown Source)
at oracle.apps.fnd.framework.webui.OAPageBean.preparePage(Unknown Source)
at oracle.apps.fnd.framework.webui.OAPageBean.preparePage(Unknown Source)
at OA.jspService(_OA.java:71)
at com.orionserver.http.OrionHttpJspPage.service(OrionHttpJspPage.java:59)
at oracle.jsp.runtimev2.JspPageTable.service(JspPageTable.java:462)
at oracle.jsp.runtimev2.JspServlet.internalService(JspServlet.java:594)
at oracle.jsp.runtimev2.JspServlet.service(JspServlet.java:518)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:713)
at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:370)
at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:871)
at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:453)
at com.evermind.server.http.HttpRequestHandler.serveOneRequest(HttpRequestHandler.java:221)
at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:122)
at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:111)
at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
at java.lang.Thread.run(Thread.java:595)
## Detail 0 ##
java.sql.SQLException: Invalid column type
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:138)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:175)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:240)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:7895)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:7572)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:8183)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectAtName(OraclePreparedStatement.java:8206)
Kindly give me any idea to clear this error.
Thanks and Regards,
Myvizhi
Hi Myvizhi,
Did you try running, from the back end, the query that was generated in the error trace? See if that query returns the desired output.
Also, I would like to know whether you are using the Oracle standard search mode, i.e. results-based search / auto-customization search.
--Keerthi -
Error while downloading data into excel file.
Hi,
I have a requirement to download data available in an xstring to an Excel file.
I have an RFC with an export parameter 'file_data' of type xstring. When I call the RFC from a Web Dynpro ABAP application, it returns the data in xstring format.
I open the Excel file using that xstring data as below:
cl_wd_runtime_services=>attach_file_to_response(
  i_filename      = 'file.xls'
  i_content       = ls_data_source-data_source  " xstring data from the RFC
  i_mime_type     = 'x-excel/application'
  i_in_new_window = abap_true ).
But the Excel file does not come out in the correct format; it gives an error while downloading:
" the file you are trying to open , 'file[1].xls', is in a different format than
specified by the file extension. Verify that the file is not corrupted and is from
a trusted source before opening the file. Do you want to open the file now? "
If I click the 'Yes' button, the file opens, but not in proper Excel format: the data comes out tab-delimited in the Excel file.
Please suggest any solution for this problem.
Thanks,
Venkat.
Hi Thomas,
Following is the logic implemented in the RFC that returns the XSTRING export parameter:
STEP 1: create a dynamic internal table.
Create a field catalog for the table using LVC_FIELDCATALOG_MERGE.
CALL METHOD cl_alv_table_create=>create_dynamic_table
  EXPORTING
    it_fieldcatalog = it_fieldcat
  IMPORTING
    ep_table        = dyn_table.
Create the Excel sheet by sending the field catalog and data table.
STEP 2: convert the text table to an xstring.
CALL FUNCTION 'SCMS_TEXT_TO_XSTRING'
  IMPORTING
    buffer   = l_content
  TABLES
    text_tab = lt_data_tab
  EXCEPTIONS
    failed   = 1
    OTHERS   = 2.
STEP 3: pass the data to NetWeaver.
PERFORM pass_data_to_nw USING is_import
                        CHANGING es_attachment_metadata. -
Dead lock error while updating data into cube
We have a scenario of a daily truncate and upload of data into a cube, with volumes arriving at around 2 million records per day. We have the parallel processing setting (PSA and data targets in parallel) in the InfoPackage to speed up the data load. This entire process runs through a process chain.
We are facing a deadlock issue every day. How can we avoid this?
In general, deadlocks occur because of degenerated indexes when volumes are very high. So my question is: does deleting the cube's indexes every day, along with the 'deletion of data target content' process, help to avoid the deadlock?
I have also observed that updating values into one InfoObject takes a long time, approximately 3 minutes per data packet. That InfoObject is placed in a dimension defined as a line item, since the volumes are very high for that specific object.
So this is the overall scenario. Two things:
1) Will deletion and recreation of the indexes help to avoid the deadlock?
2) Any idea why the insertion into the InfoObject takes so long (a direct read on the SID table of that object is observed in the SQL statement)?
Regards.
Hello,
1) Will deletion and recreation of the indexes help to avoid the deadlock?
Ans:
To avoid this problem, drop the indexes of the cube before uploading the data, and rebuild the indexes afterwards.
Also:
Find out in SM12 which process is causing the lock and delete it.
Find out in SM66 which process has been running for a very long time and stop it.
Check transaction SM50 for the number of work processes available in the system. If they are not adequate, you have to increase them with the help of the Basis team.
2) Any idea why the insertion into the InfoObject takes so long (a direct read on the SID table of that object is observed in the SQL statement)?
Ans:
A line-item dimension is one way to improve data load as well as query performance, by eliminating the need for a dimension table. So while loading/reading, there is one less table to deal with.
Check the transformation mapping of that characteristic: if any routine/formula is written there, it can lead to more processing time for that InfoObject.
Storing mass data in InfoCubes at document level is generally not recommended, because when data is loaded a huge SID table is created for the document-number line-item dimension.
Check whether your InfoObject is similar to a document number.
Regards,
Dhanya