Problem loading data with DTP
Hi everyone,
I am trying to load data from a DSO to an InfoCube. The problem is as follows: normally a DTP processes the packages in the request, but in this case the DTP doesn't process any package, yet it finishes with status green.
When I manage the target, I see that the request finishes in red.
The symptom is that the DTP doesn't process any request, and I don't know why.
Status: the DSO has active data, and the InfoCube is empty.
I don't know what is happening, but the data is not carried from the DSO to the target.
I hope you can help me,
Regards.
Jose
Answering your questions:
Are you seeing the request red in Manage and Green in Monitor Details ?
Yes, in Manage I see the requests in red, but in the monitor details everything is green, although no processed requests appear there.
Are the requests green and active in the source DSO ?
OK, this DSO is for direct update, and it's filled by an APD process, so the DSO doesn't have an active request; it just has the data in the active-data table. I think that doesn't matter, because I'm trying to do the same thing between two InfoCubes and it doesn't work either.
It looks like the problem is general for the whole warehouse.
Additionally, this DTP worked correctly last week; this week it started to fail.
It could be a system problem.
What can I check?
Also check whether this is an authorization issue - SU53.
I checked SU53 and everything is OK.
In Monitor Header - Selections, you should see the requests that have been loaded from the source.
I don't see anything in this field.
Thanks
Jose
Similar Messages
-
Loading a huge volume of data with DTP
Hi Experts,
I have a problem uploading a huge volume of data with DTPs.
My initialization is done, as I am doing reloads. The data covers fiscal year/period 000.2010 to 016.9999, so the volume is huge.
I tried uploading the data in chunks, dividing it into three months per DTP, as full loads.
But when I process the DTP, the data packages are decided at the source, and I get about 2000 data packages.
The request turns red after about 1000 data packages are processed, and the batch processes allocated to it also stop.
I tried dividing the DTP by single months and got the same problem. I have deleted the indexes before uploading to the cube and changed the batch-processing setting from 3 to 5.
Can anyone please advise what the problem could be? I am doing these reloads in the quality system.
How can I upload this data, which runs into millions of records?
Thanks,
Tati
Hi Galban,
I have changed the parallel processing from 3 to 5, and I have also looked at the data package size.
Can you please advise how I can increase the data package size? For my upload, the package size corresponds to the package size in the source and is determined dynamically at runtime.
Please advise.
Thanks
Tati -
SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader
I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data, I did it from a command line with SQL*Loader.
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect OCISERVERATTACH
ORA-12154: TNS:could not resolve the connect identifier specified
I've searched for postings on these error messages, and they all seem to say that SQL*Loader can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
However, SQL Developer will not let me load a file this big.
I have also tried to load the file within Apex (SQL Workshop / Utilities), but again the file is too big.
So it seems like SQL*Loader is the only option.
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
Not sure what else to try or where to look.
Thanks
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in Ed's link that you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This will tell Oracle to use the config files it finds there and no others.
then try sqlldr user/pass@db (in the same DOS window)
See if that connects and let us know.
Cheers,
Harry
http://dbaharrison.blogspot.com -
Short dump ASSIGN_TYPE_CONFLICT while loading data through DTP
Dear all:
We currently work with BI NW2004s SP10. We created a transformation mapping an InfoSource to an InfoCube based on a 3.x transfer rule. For example, we used cube 0PUR_C04 and DataSource 2LIS_02_ITM_CP, and the transformation is "TRCS Z2LIS_02_ITM_CP -> CUBE 0PUR_C04". Every time we try to load data via the DTP, a runtime short dump occurs: ASSIGN_TYPE_CONFLICT.
Error analysis:
You attempted to assign a field to a typed field symbol, but the field does not have the required type.
We reactivated the transformation and the DTP back and forth, but the same error still occurred.
Any ideas, please?
BR
SzuFen
Hi Pavel:
Please refer to the following information-
User and Transaction
Client.............. 888
User................ "TW_S
Language key........ "E"
Transaction......... " "
Program............. "GPD0
Screen.............. "SAPM
Screen line......... 6
===========================================================
Information on where terminated
Termination occurred in the ABAP program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" - in
"EXECUTE".
The main program was "RSBATCH_EXECUTE_PROZESS ".
In the source code you have the termination point in line 704
of the (Include) program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
The program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL" was started as a background job.
Job Name....... "BIDTPR_284_1"
Job Initiator.. "TW_SZU"
Job Number..... 16454800
===========================================================
Short text
Type conflict with ASSIGN in program "GPD0QBVJ2WFQZZXBD0IJ1DSZAEL".
===========================================================
Error analysis
You attempted to assign a field to a typed field symbol,
but the field does not have the required type.
===========================================================
Line SourceCde
674 ELSE.
675 ASSIGN rdsTG_1->* to <_ys_TG_1>.
676 CLEAR <_ys_TG_1>.
677 MOVE-CORRESPONDING G1 TO <_ys_TG_1>.
678 <_ys_TG_1>-requid = l_requid.
679 l_recno_TG_1 = l_recno_TG_1 + 1.
680 ls_cross-insegid = 1.
681 ls_cross-inrecord = l_recno_SC_1.
682 ls_cross-outsegid = 1.
683 ls_cross-outrecord = l_recno_TG_1.
684
685 CALL METHOD i_r_log->add_cross_tab
686 EXPORTING
687 I_S_CROSSTAB = ls_cross.
688
689 ** Record# in target = sy-tabix - if sorting of table won't be changed
690 <_ys_TG_1>-record = l_recno_TG_1.
691 INSERT <_ys_TG_1> INTO TABLE <_yth_TG_1>.
692 IF sy-subrc <> 0.
693 CALL METHOD cl_rsbm_log_step=>raise_step_failed_callstack.
694 ENDIF.
695
696 ENDIF. "Read table
697 *
698 ENDIF.
699 CLEAR skipseg_all.
700 ENDLOOP.
701 * --- insert table into outbound segment ---
702
703 <_yt_TG_1>[] = <_yth_TG_1>[].
>>>>>
705 rTG_1->insert_table( rdtTG_1_dp ).
706 ENDMETHOD. "execute
707
708
709
710 endclass. "lcl_transform IMPLEMENTATION
711
712 &----
713 *& Form get_runtime_ref
714 &----
715 * text
716 ----
717 * -->C_R_EXE text
718 ----
719 form get_runtime_ref
720 changing c_r_exe type ref to object.
721
722 data: l_r_exe type ref to lcl_transform.
723 create object l_r_exe.
===========================================================
Contents of system fields
Name Val.
SY-SUBRC 0
SY-INDEX 3
SY-TABIX 0
SY-DBCNT 1
SY-FDPOS 0
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY
SY-UCOMM
SY-TITLE Execute Batch Process
SY-MSGTY E
SY-MSGID R7
SY-MSGNO 057
SY-MSGV1 0TOTDELTIME
SY-MSGV2 A
SY-MSGV3
SY-MSGV4
SY-MODNO 0
SY-DATUM 20070420
SY-UZEIT 164557
SY-XPROG SAPCNVE
SY-XFORM CONVERSION_EXIT
===========================================================
Active Calls/Events
No. Ty. Program Include
Name
6 METHOD GPD0QBVJ2WFQZZXBD0IJ1DSZAEL GPD0QBVJ2WFQZZXBD0IJ1DSZAEL
LCL_TRANSFORM=>EXECUTE
5 METHOD CL_RSTRAN_TRFN_CMD============CP CL_RSTRAN_TRFN_CMD============CM005
CL_RSTRAN_TRFN_CMD=>IF_RSBK_CMD_T~TRANSFORM
4 METHOD CL_RSBK_PROCESS===============CP CL_RSBK_PROCESS===============CM00Q
CL_RSBK_PROCESS=>PROCESS_REQUEST
3 METHOD CL_RSBK_PROCESS===============CP CL_RSBK_PROCESS===============CM002
CL_RSBK_PROCESS=>IF_RSBATCH_EXECUTE~EXECUTE
2 FUNCTION SAPLRSBATCH LRSBATCHU13
RSBATCH_EXECUTE_PROCESS
1 EVENT RSBATCH_EXECUTE_PROZESS RSBATCH_EXECUTE_PROZESS
START-OF-SELECTION
===========================================================
Thank you and BR
SF -
Load data with 7.0 DataSource from flat file to write-optimized DSO
Hi all,
we have a problem loading data from a flat file using a 7.0 DataSource.
We have to load a flat file (monthly) into a write-optimized DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads data in delta mode from the DataSource into the WO DSO.
When I load the second file into the DataSource, the DTP loads all data present in the DataSource, and not only the new data as expected with delta mode.
Has anyone any tips to help me?
Thank you for help.
Regards
Emiliano
Hi,
I am facing a similar problem.
I am using a write-optimized DSO and I have got only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
When I do a delta load from the PSA to the DSO, I expect to see only that 1 request get loaded into the DSO.
But it's picking up the data from 3 other requests and doubling the records...
Can you please help me: how did you manage to get out of that issue?
Cheers,
Nisha -
Problem loading data from the PSA to the InfoCube
Hello experts.
I'm having a problem loading data from the PSA to the InfoCube.
I'm using a DTP for this process, but the following error is happening:
"Diagnosis
An error occurred while executing the transformation rule:
The exact error message is:
Overflow converting from''
The error was triggered at the following point in the program:
GP4KMDU7EAUOSBIZVE233WNLPIG 718
System Response
Processing of the data record has been terminated.
Procedure
The Following is additional information included in the higher-level
node of the monitor:
Transformation ID
Data record number of the source record
Number and the name of the rule Which produced the error
Procedure for System Administration
I have already created new DTPs and deactivated and reactivated the InfoCube and the transformation, but nothing solves it.
Does anyone have any idea what to do?
Thank you.
Hi,
Is it a flat file load, or loading from another data source?
Try to execute the program GP4KMDU7EAUOSBIZVE233WNLPIG in SE38 (the 718 is the line number from the error) and check that it is active and has no syntax errors.
Check the mapping of the fields in the transformation: whether some data fields are mapped to decimal, a CHAR 32 field is mapped to RAW 16, or CALWEEK/CALMONTH is mapped to CALDAY, etc.
Check in ST22 if there are any short dumps.
Regards
KP -
SQL * Loader : Load data with format MM/DD/YYYY HH:MI:SS PM
Please advise how to load data in the format MM/DD/YYYY HH:MI:SS PM into an Oracle table using SQL*Loader.
- What format mask should I give in the control file?
- What column type should the table have for this data?
Sample data below;
MM/DD/YYYY HH:MI:SS PM
12/9/2012 2:40:20 PM
11/29/2011 11:23:12 AM
Thanks in advance
Avinash
Hello Srini,
I tried the creation date as a DATE datatype, but I got this error:
ORA-01830: date format picture ends before converting entire input string
I am running SQL*Loader from the Oracle R12 EBS front end.
The contents of my control file are:
{code}
LOAD DATA
INFILE "$_FileName"
REPLACE
INTO TABLE po_recp_int_lines_stg
WHEN (01) = 'L'
FIELDS TERMINATED BY "|"
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
INDICATOR POSITION(1) CHAR,
TRANSACTION_MODE "TRIM(:TRANSACTION_MODE)",
RECEIPT_NUMBER "TRIM(:RECEIPT_NUMBER)",
INTERFACE_SOURCE "TRIM(:INTERFACE_SOURCE)",
RECEIPT_DATE "TO_CHAR(TO_DATE(:RECEIPT_DATE,'MM/DD/YYYY'),'DD-MON-YYYY')",
QUANTITY "TRIM(:QUANTITY)",
PO_NUMBER "TRIM(:PO_NUMBER)",
PO_LINE_NUMBER "TRIM(:PO_LINE_NUMBER)",
CREATION_DATE "TO_CHAR(TO_DATE(:CREATION_DATE,'MM/DD/YYYY HH:MI:SS AM'),'DD-MON-YYYY HH:MI:SS AM')",
ERROR_MESSAGE "TRIM(:ERROR_MESSAGE)",
PROCESS_FLAG CONSTANT 'N',
CREATED_BY "fnd_global.user_id",
LAST_UPDATE_DATE SYSDATE,
LAST_UPDATED_BY "fnd_global.user_id"
)
{code}
My data file goes like
{code}
H|CREATE|123|ABC|12/10/2012||||
L|CREATE|123|ABC|12/10/2012|100|PO12345|1|12/9/2012 2:40:20 PM
L|CORRECT|123|ABC|12/10/2012|150|PO12346|2|11/29/2011 11:23:12 AM{code}
Below is the desc of the table
{code}
INDICATOR VARCHAR2 (1 Byte)
TRANSACTION_MODE VARCHAR2 (10 Byte)
RECEIPT_NUMBER NUMBER
INTERFACE_SOURCE VARCHAR2 (20 Byte)
RECEIPT_DATE DATE
QUANTITY NUMBER
PO_NUMBER VARCHAR2 (15 Byte)
PO_LINE_NUMBER NUMBER
CREATION_DATE TIMESTAMP(0)
ERROR_MESSAGE VARCHAR2 (4000 Byte)
PROCESS_FLAG VARCHAR2 (5 Byte)
CREATED_BY NUMBER
LAST_UPDATE_DATE DATE
LAST_UPDATED_BY NUMBER {code}
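A note that may help with the ORA-01830 (a sketch, not verified against this table): the TO_CHAR(TO_DATE(...)) wrapping converts the parsed value back into a string, and that string then has to match the session's default DATE/TIMESTAMP format when it is inserted into the column. Letting SQL*Loader apply the mask directly avoids the round trip; the field names are the ones from the control file above:

```sql
-- Sketch: use SQL*Loader's own datetime field types with an explicit mask
-- instead of TO_CHAR-ing the parsed value back into a string.
RECEIPT_DATE    DATE "MM/DD/YYYY",
CREATION_DATE   TIMESTAMP "MM/DD/YYYY HH:MI:SS AM",
```

The mask itself can be sanity-checked in SQL*Plus with: SELECT TO_TIMESTAMP('12/9/2012 2:40:20 PM', 'MM/DD/YYYY HH:MI:SS AM') FROM dual;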
Thanks,
Avinash -
Problem loading data into write-optimized DSO
Hi ,
I am having a problem loading data from the PSA to a write-optimized DSO.
I have changed a normal DSO into a write-optimized DSO. I have 2 data sources to be loaded into it: one for demand and one for inventory.
The loading of demand data from PSA to DSO happens fine without any error.
But while loading the inventory data from PSA to DSO, I get the errors below:
"Data Structures were changed. Start Transaction before hand"
Exception CX_RS_FAILED logged
I have tried reactivating the DSO, transformation and DTP and loading the data again into the write-optimized DSO, but no luck; I always get the above error message.
Can someone please suggest what could be done to avoid these error messages and load the data successfully?
Thanks in advance.
Hi,
Check whether any routines are written in the transformations.
Check the data structures of the cube and the DataSource as well: are there any changes in the structure?
Data will normally load up to the PSA even if the structure of the DSO has changed; in that case the error occurs at the DSO load. Just check it.
Check the below blog:
/people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso
Let us know the status.
Reg
Pra -
How can I load data with scripts in FDM to an HFM target system?
Hi all!
I need help because I can't find a good guide about scripting in FDM. My problem is the following.
I plan to load my data with a data load file in FDM to an HFM target system, but I would also like to load additional data using an event script, i.e. after validate. I would need some way to access the HFM system through FDM scripts; is that possible?
If so, it would be wonderful to get data from HFM for any point of view, reachable from FDM scripts, in order to load or read any data.
I've been looking for a good guide about scripting in FDM, but I couldn't find any information about accessing data in the HFM target system. Does one really exist?
Thanks for help
Hi,
Take a look at the LOAD Action scripts of your adapter. This might give you an idea.
Theoretically it should be possible to load data in an additional load, but you need to be very careful: you don't want to corrupt any of the log and status information that is stored during the load process. The audit trail is an important feature in many implementations, so in this context gaining some automation may not be worth the compliance risk to your system.
Regards,
Matt -
URGENT: Problems Loading files with SQL Loader into a BLOB column
Hi friends,
I have read a lot about how to load files into BLOB columns, but I ran into errors that I can't solve.
I've read several notes in these forums, one of them:
sql loader: loading external file into blob
and tried the solutions, but without good results.
Here are some of my tests:
With this .ctl:
LOAD DATA
INFILE *
INTO TABLE mytable
REPLACE
FIELDS TERMINATED BY ','
(number1 INTEGER EXTERNAL,
cad1 CHAR(250),
image1 LOBFILE(cad1) TERMINATED BY EOF
)
BEGINDATA
1153,/opt/oracle/appl/myapp/1.0.0/img/1153.JPG,
the error when I execute sqlldr is:
SQL*Loader-350: Syntax error at line 9.
Expecting "," or ")", found "LOBFILE".
image1 LOBFILE(cad1) TERMINATED BY EOF
^
What problem exists with LOBFILE ??
(mytable of course has number1 as a NUMBER, cad1 as VARCHAR2(250) and image1 as BLOB.)
I tried too with :
LOAD DATA
INFILE sample.dat
INTO TABLE mytable
FIELDS TERMINATED BY ','
(cad1 CHAR(3),
cad2 FILLER CHAR(30),
image1 BFILE(CONSTANT "/opt/oracle/appl/myapp/1.0.0/img/", cad2))
sample.dat is:
1153,1153.JPEG,
and error is:
SQL*Loader-350: Syntax error at line 6.
Expecting "," or ")", found "FILLER".
cad2 FILLER CHAR(30),
^
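(A note on both errors: LOBFILE and FILLER were only introduced with the Oracle 8i version of SQL*Loader, so an older sqlldr client, e.g. 7.x or 8.0, stops at exactly those keywords with an "Expecting "," or ")"" syntax error. Assuming an 8.1-or-later client, a control file along these lines should parse; table name and path are the ones from the first attempt:)

```sql
-- Sketch, assuming sqlldr 8.1+ (older clients do not know LOBFILE).
-- cad1 holds the file path and also feeds LOBFILE for the BLOB column.
LOAD DATA
INFILE *
INTO TABLE mytable
REPLACE
FIELDS TERMINATED BY ','
(number1 INTEGER EXTERNAL,
 cad1    CHAR(250),
 image1  LOBFILE(cad1) TERMINATED BY EOF)
BEGINDATA
1153,/opt/oracle/appl/myapp/1.0.0/img/1153.JPG,
```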
I tried too with a procedure, but without results...
Any idea about these error messages?
Thanks a lot.
Jose L.
> So you think that if one person puts an "urgent" in the subject he is screwing the problems of other people?
Absolutely. You are telling them "My posting is more important than yours and deserves faster attention and resolution than yours!".
So what could a typical response be? Someone telling you that his posting is more important by using the phrase "VERY URGENT!". And the next poster may decide that, no, his problem is even more important, and use "EXTREMELY URGENT!!" as the subject. And the next one then raises the stakes by claiming his problem is "CODE RED! CRITICAL. DEFCON 4. URGENT!!!!".
Stupid, isn't it? As stupid as your insistence that there is nothing wrong with your pitiful clamoring for attention to your problem by saying it is urgent.
What do the RFCs say about a meaningful title/subject in a public forum? I trust that you know what an RFC is? After all, you claim to have used public forums on the Internet for some years now.
The RFC on "public forums" is called The Usenet Article Format. This is what it has to say about the SUBJECT of a public posting:
=
The "Subject" line (formerly "Title") tells what the message is about. It should be suggestive enough of the contents of the message to enable a reader to make a decision whether to read the message based on the subject alone. If the message is submitted in response to another message (e.g., is a follow-up) the default subject should begin with the four characters "Re: ", and the "References" line is required. For follow-ups, the use of the "Summary" line is encouraged.
=
([url http://www.cs.tut.fi/~jkorpela/rfc/1036.html]RFC 1036, the Usenet article format)
Or how about [url http://www.cs.tut.fi/~jkorpela/usenet/dont.html]The seven don'ts of Usenet?
Point 7 of the Don'ts:
Don't try to catch attention by typing something foolish like "PLEASE HELP ME!!!! URGENT!!! I NEED YOUR HELP!!!" into the Subject line. Instead, type something informative (using normal mixed case!) that describes the subject matter.
Please tell me that you are not too thick to understand the basic principles of netiquette, or to argue with the RFCs that governs the very fabric of the Internet.
As for when I have an "urgent" problem in my "real" work? I take it up with Oracle Support on Metalink by filing an iTAR/SR, as any non-idiot should do with a real-life Oracle crisis problem.
I do not barge into a public forum like you do, jump up and down, and demand quick attention by claiming that my problem is more important, more urgent and more deserving of attention than other people's problems in the very same forum. -
Hello Expert,
I loaded data from SAP R/3 into the PSA using delta mode and found that a newly created record was not loaded into SAP BI. What could be the cause?
Step of loading
1. Initial without data into PSA
2. Full load with criteria Fiscal Period (e.g. 001.2009 - 016.2009) into PSA
3. Load data from PSA into DSO with update mode = Delta
4. Create a new transaction from SAP R/3
5. Load data from SAP R/3 into SAP BI-PSA with update mode = Delta
Expected Result: A new record should be loaded into PSA.
Actual Result: There was no record loaded.
After the initialization without data transfer, is it necessary to do a full load of all fiscal periods and only then load with delta mode?
I have never seen this kind of problem before.
Is anyone familiar with this kind of problem? How did you resolve it?
Any suggestion would be appreciated.
Thank you very much
W.J.
Hi,
Is your DataSource Logistics? If so, how is the job scheduled in the LO Cockpit (hourly/daily)?
Did you check for the record in the delta queue (RSA7)?
After initial loading without data, is it necessary to full load with all fiscal period. And then load with Delta mode?
I have never seen this kind of problem before.
If you have specific selections, you have to load them with a repair full request; it doesn't impact the delta loads. -
Problem loading data in cube 0REFX_C03
Hi all,
Has anyone had problems loading data into the 0REFX_C03 cube?
I am trying to load it, but it just takes too long (a predicted time of 10 days!) and times out after a while.
Please advise if anyone has any suggestions.
Thank you.
Edited by: Shalini on Feb 6, 2008 8:48 AM
That's great, Shalini.
If it is helpful, please assign points.
If you need any information on that please mail me [email protected]
Regards,
RK Ghattamaneni. -
Problem loading data from jena
Hi, two issues when loading data into Oracle from a jena model:
1. The incremental and batch load both work well, except when we add a triple with a literal typed as double:
triple = new Triple(dirNode.asNode(), Node.createURI("http://www.w3.org/2003/01/geo/wgs84_pos#long"), Node.createLiteral(geopos.getLongitude().toString(), null, (RDFDatatype) XSDDatatype.XSDdouble));
graph.add(triple);
We get the error:
SEVERE: Could not add triple
java.sql.BatchUpdateException: ORA-55303: SDO_RDF_TRIPLE_S constructor failed: Simple case: SQLERRM=ORA-55328: failed attempting to insert the literal value "-5.9278863"^^<http://www.w3.org/2001/XMLSchema#double>
ORA-06512: at "MDSYS.MD", line 1723
ORA-06512: at "MDSYS.MDERR", line 17
ORA-06512: at "MDSYS.SDO_RDF_TRIPLE_S", line 211
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1335)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3449)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3530)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeUpdate(OraclePreparedStatementWrapper.java:1062)
2. The bulk load simply does not work:
((OracleBulkUpdateHandler) graph.getBulkUpdateHandler()).addInBulk(GraphUtil.findAll(model.getGraph()), "sem_ts");
We get:
01-Oct-2009 13:11:39 oracle.spatial.rdf.client.jena.SimpleLog warn
WARNING: addInBulk: [92 ] sqle
java.sql.SQLException: ORA-44004: invalid qualified SQL name
ORA-06512: at "SYS.DBMS_ASSERT", line 188
ORA-06512: at "MDSYS.SDO_RDF", line 242
ORA-06512: at "MDSYS.RDF_APIS", line 693
ORA-06512: at line 1
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:439)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:395)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:802)
In both cases our connection is something like:
public static String conexion = "jdbc:oracle:thin:user/pass@ourserver:1521:ourdb";
Any idea? Thanks
Hi Wu, we have included your code in a Java test and got the same problem.
Our installation is Oracle Database 11.2.0.1.0. We then added the 'Semantic patch' 11_2_sem and followed the instructions to create a tablespace and the RDF/SEM network. Finally we created a model as [explained here|http://download.oracle.com/docs/cd/E11882_01/appdev.112/e11828/sdo_rdf_concepts.htm#CHDEDFFA].
Part of the exception text is in Spanish; it basically says 'failed to insert the literal value'. The rest of the data in the app has been inserted correctly.
This is the Java test:
public class PruebaOracleTest extends TestCase {
    String jdbcUrl = "jdbc:oracle:thin:user/pass@server:1521:bd";

    public void testInsertData() throws Exception {
        Oracle oracle = new Oracle(jdbcUrl, null, null);
        GraphOracleSem graph = new GraphOracleSem(oracle, "ARTICLES");
        ModelOracleSem model = new ModelOracleSem(graph);
        Model inMemoryJenaModel = ModelFactory.createDefaultModel();
        long lStartTime = System.currentTimeMillis();
        System.out.println("testCustomerMisc: start");
        Triple t = new Triple(Node.createURI("http://sub"), Node
                .createURI("http://www.w3.org/2003/01/geo/wgs84_pos#long"),
                Node.createLiteral("-5.9278863", null,
                        (RDFDatatype) XSDDatatype.XSDdouble));
        graph.add(t);
        graph.flushAdd();
        String queryString = "SELECT * " + "WHERE { "
                + " ?subject ?predicate ?object . " + "} ";
        Query query = QueryFactory.create(queryString);
        QueryExecution qexec = QueryExecutionFactory.create(query, model);
        ResultSet results;
        results = qexec.execSelect();
        ResultSetFormatter.out(System.out, results, query);
    }

    public void testListTriples() throws Exception {
        Oracle oracle = new Oracle(jdbcUrl, null, null);
        GraphOracleSem graph = new GraphOracleSem(oracle, "ARTICLES");
        int cont = 0;
        ExtendedIterator it = graph.find(Triple.ANY);
        while (it.hasNext() && cont < 100) {
            Triple t = (Triple) it.next();
            System.out.println(t.toString());
            cont++;
        }
        graph.close();
        oracle.dispose();
    }

    public void testCleanModel() throws Exception {
        Oracle oracle = new Oracle(jdbcUrl, null, null);
        GraphOracleSem graph = new GraphOracleSem(oracle, "ARTICLES");
        ModelOracleSem model = new ModelOracleSem(graph);
        model.removeAll();
        graph.close();
        oracle.dispose();
    }
}
And this is the exception we get:
java.sql.SQLException: ORA-55303: SDO_RDF_TRIPLE_S constructor failed: Simple case: SQLERRM=ORA-55328: failed attempting to insert the literal value "-5.9278863"^^<http://www.w3.org/2001/XMLSchema#double>
ORA-06512: at "MDSYS.MD", line 1723
ORA-06512: at "MDSYS.MDERR", line 17
ORA-06512: at "MDSYS.SDO_RDF_TRIPLE_S", line 211
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:439)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:395)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:802)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:436)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:186)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:521)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:205)
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1008)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1307)
at oracle.jdbc.driver.OraclePreparedStatement.sendBatch(OraclePreparedStatement.java:3753)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.sendBatch(OraclePreparedStatementWrapper.java:1140)
at oracle.spatial.rdf.client.jena.GraphOracleSem.flushAdd(GraphOracleSem.java:1219)
at org.fundacionctic.ogd.data.support.PruebaOracleTest.testInsertData(PruebaOracleTest.java:42)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196) -
Unable to load data with impdp
Hi friends,
I've encountered the following errors while loading data through the impdp utility.
ORA-31626: job does not exist
ORA-31633: unable to create master table "TRACE.SYS_IMPORT_FULL_05"
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT", line 863
ORA-00955: insufficient privileges
I think the problem is in the last line, ORA-00955. What is your opinion? Kindly tell me what privileges the user should have to import/export a dump file.
Looking for your help and suggestions.
Regards,
Abbasi
Does this dumpfile consist of only TRACE schema objects, or of other schemas' objects too?
There is no need to grant DBA privileges to TRACE; you can import using the SYS/SYSTEM user:
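One thing worth checking before privileges (an aside): ORA-00955 normally reads "name is already used by an existing object", so together with ORA-31633 it usually means the Data Pump master table TRACE.SYS_IMPORT_FULL_05 was left behind by an earlier failed run rather than a missing grant. A sketch of the cleanup (the schema and table name are taken from the error text above):

```sql
-- Data Pump creates a "master table" named after the job in the importing
-- schema; if a previous run died, the leftover table blocks the next one.
DROP TABLE trace.sys_import_full_05;
```

Alternatively, attach to the orphaned job with impdp trace/***@TNRDB attach=SYS_IMPORT_FULL_05 and issue KILL_JOB at the prompt.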
impdp system/****@TNRDB directory=tnr_dump_dir dumpfile=tnrsms.dmp logfile=loading.log
Thanks -
Problem downloading data with trailing blanks using GUI_DOWNLOAD
Hi All,
I am using SAP 4.6C. I have a requirement to download data with trailing blanks, but I have not fully succeeded in achieving it. I want the downloaded data to be in the format below.
Eg: 'T000000070600000004000000003593593 '
I want to download data with a different number of trailing blanks for each record, for example 18 in the 1st line, 25 in the 2nd, 22 in the 3rd.
I already used the hex notation value '09', but it gives only 7 at once; I added this field 3 times, but that only gives 21. How can I add the extra spaces?
I tried to move ' ' spaces as well; it doesn't work. I also tried CONCATENATE with ' ' and SEPARATED BY ' ', but no use.
I have passed all possible parameters to GUI_DOWNLOAD FM.
say write_field_separator = ' '
write_lf = 'X'
trunc_trailing_blanks = ' '
trunc_trailing_blanks_eol = ' '
Could someone who has faced the same problem earlier please tell me? It is very urgent.
Is there any class attribute in SAP 4.6C
(like cl_abap_char_utilities in 4.7)?
Thanks in advance.
Basha
Try:
CONCATENATE xxxx space INTO result RESPECTING BLANKS.
Of course you must retain these parameters :
trunc_trailing_blanks = space
trunc_trailing_blanks_eol = space
Hope this helps