Unable to replicate Oracle data into TimesTen
I have created cache group COMPANY_MASTER.
Cache group:
Cache Group TSLALGO.COMPANY_MASTER_TT:
Cache Group Type: Read Only
Autorefresh: Yes
Autorefresh Mode: Incremental
Autorefresh State: On
Autorefresh Interval: 1 Minute
Autorefresh Status: ok
Aging: No aging defined
Root Table: TSLALGO.COMPANY_MASTER
Table Type: Read Only
But whenever I start the TimesTen server, the following locks are seen in ttXactAdmin <dsn_name>:
Program File Name: timestenorad
30443 0x7fab902c02f0 7.22 Active Database 0x01312d0001312d00 IX 0
Table 1733200 S 4221354128 TSLALGO.COMPANY_MASTER
Row BMUFVUAAAAaAAAAFBy S 4221354128 SYS.TABLES
Row BMUFVUAAACkAAAALAF Sn 4221354128 SYS.CACHE_GROUP
Due to these locks, Oracle data is not replicated into TimesTen.
When we check the sqlCmdID, it shows the following output:
Query Optimizer Plan:
Query Text: CALL ttCacheLockCacheGp(4, '10752336#10751968#10751104#10749360#', 'S', '1111')
STEP: 1
LEVEL: 1
OPERATION: Procedure Call
TABLENAME:
TABLEOWNERNAME:
INDEXNAME:
INDEXEDPRED:
NONINDEXEDPRED:
Please suggest why TimesTen takes locks on these tables.
Similar Messages
-
How to import data into TimesTen from Oracle
I want to import data from Oracle into TimesTen, but I have not found any tools provided by TimesTen. I hope TimesTen can convert Oracle's dump file (exp). Thanks for your suggestions.
TimesTen cannot read Oracle's data pump export files.
If you are using Cache Groups to cache Oracle tables into TimesTen, then the LOAD CACHE GROUP statement will automatically load the data from Oracle into TimesTen for you.
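For example, a minimal sketch (assuming a hypothetical cache group named my_cache_group; the COMMIT EVERY clause is optional but keeps the load transactions small):

```sql
-- Populate the TimesTen cache tables from the cached Oracle tables
LOAD CACHE GROUP my_cache_group COMMIT EVERY 256 ROWS;
```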
If you are using TimesTen as a standalone database, then you can SPOOL out the Oracle table data into text files, and use them as data files to feed the TimesTen ttbulkcp command line utility. ttbulkcp is similar to SQL*Loader except it handles both import and export of data.
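The SPOOL half might look like this in SQL*Plus (a sketch with a hypothetical table foo and columns a, b, c; the delimiter and quoting in the spooled file must match what ttbulkcp expects):

```sql
-- Hypothetical SQL*Plus session: dump table foo as comma-separated text
SET HEADING OFF FEEDBACK OFF PAGESIZE 0 TRIMSPOOL ON
SPOOL foo.txt
SELECT a || ',' || b || ',' || c FROM foo;
SPOOL OFF
```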
For example, the following command dumps the contents of the table foo from a TimesTen database mydb into a file called foo.dump.
ttbulkcp -o dsn=mydb foo foo.dump
The file foo.dump will give you an example of what the default ttbulkcp dump file format looks like. You can then generate a new dump file based on the above, but populate it using the data from Oracle.
The following command loads the rows listed in file foo.dump into a table called foo in database mydb, placing any error messages into the log file foo.err.
ttbulkcp -i -e foo.err dsn=mydb foo foo.dump
For more information on the ttbulkcp utility you can refer to the Oracle TimesTen API Reference Guide.
You can also use SQL Developer to export data from an Oracle table into ttbulkcp file format, via the Export Data option. Please note that this option is meant for small tables, say with a few thousand rows; exporting large tables can be slow in SQL Developer and could require a lot of client-side memory. -
Unable to download Oracle Data Integrator-Version 11.1.1.6(Important)
Unable to download Oracle Data Integrator with version 11.1.1.6. Hope this could be resolved ASAP.
966234 wrote:
Unable to download Oracle Data Integrator with version 11.1.1.6. Hope this could be resolved ASAP.
What is the file you are trying to download? Is it for Windows or Linux or All Platforms?
Thanks,
Hussein -
How to load oracle data into SQL SERVER 2000?
how to load oracle data into SQL SERVER 2000.
IS THERE ANY UTILITY AVAILABLE?
Not a concern for an Oracle forum.
Also no need for SHOUTING.
Conventional solutions are
- dump the data to a csv file and load it in Mickeysoft SQL server
- use Oracle Heterogeneous services
- use Mickeysoft DTS
Whatever you prefer.
Sybrand Bakker
Senior Oracle DBA -
Hi,
We are looking into options that we have to load data into TimesTen. The data would be in data files.
We are looking at three options -- Custom jdbc program, Custom PL/SQL procedure or ttBulkCp.
Which would be a better option in terms of performance ? If we consider a data size of 100 million records, which one would perform better ?
Does anybody have any performance benchmarks for any of the same ?
Thanks and Regards,
Pritom
ttBulkCp is usually pretty fast (especially in direct mode). However, it is a generic program so it does have to parse each input line. Depending on the implementation and file format, custom code might have to do less parsing and so might be a little faster. However, Java/JDBC is significantly slower than C/ODBC so YMMV. Custom C/ODBC code would likely be somewhat faster than ttBulkCp, but whether the difference is worth the effort is hard to say. Custom PL/SQL is unlikely to be faster than direct-mode ttBulkCp, but I have never compared them.
I don't personally know of any benchmark numbers that compare these three methods (or any subset of them).
Note that as long as the loading mechanism is reasonably efficient then the hardware spec and TimesTen configuration will also be very important in obtaining optimal load performance.
Chris -
Unable to store log data into database through JDBCAppender of Log4j
I am able to store the log data into a file as well as display it on the console, but I am unable to store the same into the database. I am not getting any error or warning during execution. The content of log.properties is as below:
log4j.rootLogger=ERROR, C, FILE
log4j.logger.org.firebird=ERROR, C
log4j.logger.org.firebirdsql=ERROR, C
log4j.logger.org.apache.joran=ERROR, C
log4j.logger.org.apache.log4j.joran.action=ERROR, C
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.file=/log.txt
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=[%d{MMM dd HH:mm:ss}] %-5p (%F:%L) - %m%n
log4j.logger.org.apache.log4j.jdbcplus.examples=DEBUG, JDBC
# console appender
log4j.appender.C=org.apache.log4j.ConsoleAppender
log4j.appender.C.layout=org.apache.log4j.PatternLayout
log4j.appender.C.layout.ConversionPattern=%d [%t] %-5p %c %x - %m%n
# JDBC appender using custom handlers, 2a)
log4j.appender.JDBC=org.apache.log4j.jdbcplus.JDBCAppender
log4j.appender.JDBC.connector=org.apache.log4j.jdbcplus.examples.MySqlConnectionHandler
log4j.appender.JDBC.sqlhandler=org.apache.log4j.jdbcplus.examples.SqlHandler
log4j.appender.JDBC.dbclass=com.mysql.jdbc.Driver
log4j.appender.JDBC2.url=jdbc:mysql:172.22.15.131/3306:plugins?
log4j.appender.JDBC2.username=user18
log4j.appender.JDBC2.password=user18
log4j.appender.JDBC.buffer=1
log4j.appender.JDBC.commit=true
log4j.appender.JDBC.sql=INSERT INTO logtest (id, prio, iprio, cat, thread, msg, layout_msg, throwable, ndc, mdc, mdc2, info,
addon, the_date, the_time, the_timestamp, created_by) VALUES (@INC@, '@PRIO@', @IPRIO@, '@CAT@', '@THREAD@', '@MSG@',
'@LAYOUT:1@', '@THROWABLE@', '@NDC@', '@MDC:MyMDC@', '@MDC:MyMDC2@', 'info timestamp: @TIMESTAMP@', '@LAYOUT@', cast
('@LAYOUT:3@' as date), cast ('@LAYOUT:4@' as time), cast ('@LAYOUT:3@ @LAYOUT:4@' as timestamp), 'me')
log4j.appender.JDBC.layout=org.apache.log4j.PatternLayout
log4j.appender.JDBC.layout.ConversionPattern=%m
Please help me out, as I got stuck...
Hi,
This might help
http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/
I think the heading of that blog post is wrong. It is a solution to import CSV.
But you can convert your Excels easily to CSV.
I think importing pure Excel is quite hard, and I have not seen any solutions.
See this post also
Importing Excel spreadsheet into Oracle via Apex
Br, Jari -
Hi,
I am new to this forum so please pardon me if I am not familiar with the rules. I need to know how to integrate DB Adapter data into the BPEL rules engine. Actually, based on a status column value in an Oracle table, this rules engine would guide it to a human task. I need help to do this.
Thx,
Hi
I am exactly working on the same scenario as below and wanted to use the Oracle Apps Adapter
1. Import the CSV files using File Adapter.
2. Move the CSV data into an Oracle staging table using the Database Adapter.
3. Validate the record by a PL/SQL package that I publish as a web service.
4. Move the validated record to the Oracle base table using the Apps Adapter.
I created a package for validating the data and calling the Oracle seeded APIs, and registered it as a concurrent program. Now when I try to find that concurrent program through the Apps Adapter, I cannot find it.
Am I missing something? Or do I have only one option, to call the concurrent program through the DB Adapter?
Can any one help me out please -
How to extract Oracle data into Excel file?
For a small automation project I have to extract data from a table/
tables and append it to the existing excel file and feed that excel
file to a command that will load data into some other environment. I
am totally new to this. So to get started I wanted to know,
1) How to extract data from sample table Foo which has columns A,B,C
and append these values as new columns to an existing excel say
fooresults.csv ?
2) Can I achieve this in a PL/SQL script, or do I need to write a Unix or
Perl script or use some other programming language? Please advise.
The "extract data from a table" part is easy; you could do that with VB/ADO, or .NET/ODP.NET. It's then a matter of taking that data and appending it to a spreadsheet that might be the hard part, and how you'd do that exactly is really more of a Microsoft question than an Oracle one.
If you want to be able to do this from the database itself and your database is on Windows, you could use either [.NET Stored Procedures|http://www.oracle.com/technology/tech/dotnet/ode/index.html] if you can manipulate the spreadsheet in .net code, or you could also use Oracle's [COM Automation Feature|http://www.oracle.com/technology/tech/windows/com_auto/index.html] if you're handy with the COM object model for Excel.
How you'd do that exactly via either .net or com or vb is the crux of the problem and is something you'd need to know before it turns into an Oracle question, but if you already know how to do that and now just need to figure out a way to do that from Oracle, either of the above might help.
Hope it helps,
Greg -
Outbound oracle data into xml file
Hi Odie,
Your previous inputs for XML inbound were working fine. In a similar way, I now need to output the Oracle Apps data into .xml file format. For that I've written one sample script like this. I thought of making use of the utl_file option in Oracle Apps.
declare
l_log_handle UTL_FILE.FILE_TYPE;
l_log_name varchar2(50);
l_path_2 varchar2(40);
l_global_file varchar2(50);
l_time number:=1;
cursor cur1 is
select xmltype(cursor(select * from fnd_menus where rownum <101)) a from dual;
--select menu_id from fnd_menus where rownum<100;
begin
BEGIN
SELECT DECODE (INSTR (VALUE, ','),
0, VALUE,
SUBSTR (VALUE, 1, (INSTR (VALUE, ',', 1)) - 1))
INTO l_path_2
FROM v$parameter
WHERE NAME = 'utl_file_dir';
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('Error while getting Unix Path ' || SQLERRM);
END;
l_log_name := 'XGBIZ_'||TO_CHAR (SYSDATE, 'YYMMDD')||l_time;
l_log_name := CONCAT(l_log_name,'.xml');
l_global_file := l_log_name;
l_log_handle := UTL_FILE.FOPEN(l_path_2,l_log_name,'W');
for cur2 in cur1 loop
UTL_FILE.PUT_LINE(l_log_handle, cur2);
end loop;
utl_file.fclose_all;
EXCEPTION
WHEN UTL_FILE.INVALID_OPERATION THEN
dbms_output.put_line('Invalid Operation For '|| l_global_file);
UTL_FILE.FCLOSE_ALL;
WHEN UTL_FILE.INVALID_PATH THEN
dbms_output.put_line('Invalid Path For '|| l_global_file);
UTL_FILE.FCLOSE_ALL;
WHEN UTL_FILE.INVALID_MODE THEN
dbms_output.put_line('Invalid Mode For '|| l_global_file);
UTL_FILE.FCLOSE_ALL;
WHEN UTL_FILE.INVALID_FILEHANDLE THEN
dbms_output.put_line('Invalid File Handle '|| l_global_file);
UTL_FILE.FCLOSE_ALL;
WHEN UTL_FILE.WRITE_ERROR THEN
dbms_output.put_line('Invalid Write Error '|| l_global_file);
UTL_FILE.FCLOSE_ALL;
WHEN UTL_FILE.READ_ERROR THEN
dbms_output.put_line('Invalid Read Error '|| l_global_file);
UTL_FILE.FCLOSE_ALL;
WHEN UTL_FILE.INTERNAL_ERROR THEN
dbms_output.put_line('Internal Error');
UTL_FILE.FCLOSE_ALL;
WHEN OTHERS THEN
dbms_output.put_line('Other Error '||'SQL CODE: '||SQLCODE||' Messg: '||SQLERRM);
UTL_FILE.FCLOSE_ALL;
end;
When running this script I am getting the error:
ERROR at line 30:
ORA-06550: line 30, column 2:
PLS-00306: wrong number or types of arguments in call to 'PUT_LINE'
ORA-06550: line 30, column 2:
PL/SQL: Statement ignored
If in the cursor declaration I happen to use 'select menu_id from fnd_menus', a plain record, then it creates the file successfully.
If I revert to the actual select statement 'select xmltype(cursor(select * from fnd_menus where rownum <101)) a from dual',
then it errors out as said above.
Please give me your valuable inputs in this regard.
Thanks & Regards
Nagendra
Hi,
There are multiple ways to generate XML documents from relational data.
Here are some :
-- SQL/XML functions : XMLElement, XMLAgg, XMLAttributes etc.
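For instance, a sketch of the SQL/XML-functions approach against the standard scott.emp demo table (one document aggregating all rows):

```sql
SELECT XMLSERIALIZE(DOCUMENT
         XMLELEMENT("ROWSET",
           XMLAGG(
             XMLELEMENT("ROW",
               XMLFOREST(e.empno, e.ename, e.sal)
             )
           )
         )
         AS CLOB)
FROM scott.emp e;
```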
-- DBMS_XMLGEN package
select dbms_xmlgen.getXML('SELECT * FROM scott.emp')
from dual;
-- XMLType constructor over a REF CURSOR (the one you chose)
select xmlserialize(document
         xmltype(
           cursor(
             select *
             from scott.emp
           )
         )
         as clob
       )
from dual;
-- From a DBUriType
select xmlserialize(document
         dburitype('/SCOTT/EMP').getXML()
         as clob
       )
from dual;
-- From XQuery using ora:view function
select xmlserialize(document
         xmlquery('<ROWSET>{ora:view("SCOTT","EMP")}</ROWSET>' returning content)
         as clob indent size = 1
       )
from dual;
If a column is NULL in the result set, those methods (except XMLElement) won't create the corresponding element.
There's an option available for the XQuery method, but only in version 11.2.
So if you want to output empty elements, you'll have to use DBMS_XMLGEN with setNullHandling method :
DECLARE
ctx DBMS_XMLGEN.ctxHandle;
v_out CLOB;
rc SYS_REFCURSOR;
BEGIN
OPEN rc FOR
SELECT *
FROM scott.emp;
ctx := DBMS_XMLGEN.newContext(rc);
DBMS_XMLGEN.setNullHandling(ctx, DBMS_XMLGEN.EMPTY_TAG);
v_out := DBMS_XMLGEN.getXML(ctx);
DBMS_XMLGEN.closeContext(ctx);
CLOSE rc;
DBMS_XSLPROCESSOR.clob2file(v_out, 'TEST_DIR', 'test_out.xml');
END;
I thought of making use of the utl_file option in Oracle apps.
You could, but you might find the DBMS_XSLPROCESSOR.clob2file procedure more convenient for that (see above).
All you have to do is serialize the XML into a CLOB variable and call the procedure.
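Putting those two steps together, a minimal sketch (assuming a TEST_DIR directory object already exists) might be:

```sql
DECLARE
  v_xml CLOB;
BEGIN
  -- Serialize the query result into a CLOB variable ...
  SELECT XMLSERIALIZE(DOCUMENT
           XMLTYPE(CURSOR(SELECT * FROM scott.emp))
           AS CLOB)
  INTO v_xml
  FROM dual;
  -- ... then write it to a file via the directory object
  DBMS_XSLPROCESSOR.clob2file(v_xml, 'TEST_DIR', 'emp_out.xml');
END;
/
```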
WHERE NAME = 'utl_file_dir';
The "utl_file_dir" init. parameter has been deprecated since 10g; use directory objects instead. -
Unable to Load the data into PSA.
Hi Experts,
I am unable to load the data into PSA.
My source system is not R/3, it is BI.
(Actually we are extracting the data (from a cube) with the help of a program into a table, and then I am creating a generic data source on that table and loading the data into my cube.)
I am getting this error message.
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the
Then I replicated the data source and activated it, but I am still getting this error message.
When I check my data source in RSA3, I get this error message:
Two internal tables are neither compatible nor convertible.
What happened?
Error in the ABAP Application Program
The current ABAP program "SAPLAQBWEXR" had to be terminated because it has
come across a statement that unfortunately cannot be executed.
Error analysis
You attempted to move one data object to another.
This is not possible here because the internal tables concerned
are neither compatible nor convertible.
Trigger Location of Runtime Error
Program SAPLAQBWEXR
Include LAQBWEXRU01
Row 419
Module type (FUNCTION)
Module Name AQBW_CALL_EXTRACTOR_QUERY
Regards,
sat534
Hi
The problem looks to be with the generic datasource.
Share details of the data source and how you created it.
Regards
Sudeep -
Unable to load the data into Cube Using DTP in the quality system
Hi,
I am unable to load the data from PSA to the Cube using a DTP in the quality system for the first time.
I am getting errors like "Data package processing terminated" and "Source TRCS 2LIS_17_NOTIF is not allowed".
Please suggest .
Thanks,
Satyaprasad
Hi,
Some InfoObjects were missing while collecting the transport.
I collected those objects and transported them; now it's working fine.
Many Thanks to all
Regards,
Satyaprasad -
Need to archive Oracle data into MS Access database
Gurus,
I've been tasked with archiving several large tables (10 million records) into an MS Access database. I'm looking through MS Access help guide which states that I can use the ODBC driver to create table, column definitions, and import data.
Does anyone have any good resources on creating an ODBC driver for MS Access. Also, any reference material would be greatly appreciated.
Thanks
Scott
sreese wrote:
Gurus,
I've been tasked with archiving several large tables (10 million records) into an MS Access database.
On the face of it, that is an astoundingly bad idea. Using the worst database product (I can't even really refer to it as an RDBMS) on the planet to archive data that is already in the best RDBMS on the planet?
What does the person who came up with this idea expect to achieve?
I'm looking through the MS Access help guide which states that I can use the ODBC driver to create table and column definitions, and import data.
ODBC drivers don't create anything. They provide a common link between ODBC-enabled apps and the native db interface.
>
Does anyone have any good resources on creating an ODBC driver for MS Access.
You don't create the driver. It is provided by either MS or Oracle (they both have them). You just configure a connection definition (aka "Data Source Name" or DSN) that utilizes it.
Also, any reference material would be greatly appreciated.
Thanks
Scott -
Unable to load the data into HFM
Hello,
We created a new HFM app, configured it with FDM, generated the output file through FDM, and loaded that file through HFM directly 5-6 times; there was no issue till here.
Then I loaded the file through FDM 4 times successfully, even for different months. But after 4 loads I started getting an error. Attached is the error log.
Please help us at the earliest.
** Begin fdmFM11XG6A Runtime Error Log Entry [2013-10-30-13:44:26] **
Error:
Code............-2147217873
Description.....System.Runtime.InteropServices.COMException (0x80040E2F): Exception from HRESULT: 0x80040E2F
at HSVCDATALOADLib.HsvcDataLoadClass.Load(String bstrClientFilename, String bstrClientLogFileName)
at fdmFM11XG6A.clsFMAdapter.fDBLoad(String strLoadFile, String strErrFile, String& strDelimiter, Int16& intMethod, Boolean& blnAccumFile, Boolean& blnHasShare, Int16& intMode)
Procedure.......clsHPDataManipulation.fDBLoad
Component.......E:\Opt\Shared\Apps\Hyperion\Install\Oracle\Middleware\EPMSystem11R1\products\FinancialDataQuality\SharedComponents\FM11X-G6-A_1016\AdapterComponents\fdmFM11XG6A\fdmFM11XG6A.dll
Version.........1116
Identification:
User............fdmadmin
Computer Name...EMSHALGADHYFD02
FINANCIAL MANAGEMENT Connection:
App Name........
Cluster Name....
Domain............
Connect Status.... Connection Open
Thanks,
Raam
We are working with the DB team but they have confirmed that there is no issue with the TB; the process we have followed:
As a standard process, while loading the data from FDM or manually to HFM, we don't write any SQL query. Using the web interface, data is loaded into the HFM application. This data can be viewed by different reporting tools (Smart View (Excel)/HFR Report/etc.).
There are no official documents on the Oracle website which talk about the insert SQL queries used to insert data into HFM tables. Hyperion does not provide much detail on its internal tables, nor much insight into the internal structure of the HFM system.
As per Hyperion blogs/forums on the internet, HFM stores the base-level data in so-called DCE tables (for example EMHFMFinal_DCE_1_2013, where EMHFMFinal is the application name, 1 identifies the Scenario and 2013 the Year). Each row in the DCE table contains data for all periods of a given combination of dimensions (also called an intersection).
We are trying to load the same data file with the Replace option (it should delete the existing data before loading the data file). -
How to convert the Oracle data into XML files
Hi All,
I have a table, for example emp; now I want to generate every row into an XML file. Could anyone please help...
ex:- emp table
eno ename sal
1 bond 3000
2 kiran 2000
3 jai 1000
4 henry 500
o/p :- I have to get 4 different files for these 4 rows.
1.xml file should contain data <ID>1</ID><eNAME>bond</eNAME><sal>3000</sal>
2.xml file should contain data <ID>2</ID><eNAME>kiran</eNAME><sal>2000</sal>
3.xml file should contain data <ID>3</ID><eNAME>jai</eNAME><sal>1000</sal>
4.xml file should contain data <ID>4</ID><eNAME>henry</eNAME><sal>500</sal>
regards,
Badri.
You can do it like this:
begin
for r in (
select empno
, xmlserialize(content xmlforest(empno as "ID", ename, sal)) as xmlcontent
from scott.emp
)
loop
dbms_xslprocessor.clob2file(r.xmlcontent, 'TEST_DIR', to_char(r.empno) || '.xml');
end loop;
end; -
How to convert Oracle Date into Java Date
Hi there,
in JDev 3.2 oracle.jbo.domain.Date was derived from oracle.sql.DATE, which has a timestampValue() method to get a java.sql.Timestamp.
In Jdev 9.0.3 Date is not!! derived from oracle.sql.DATE, so my previous implementation fails.
So, How can I get a java.util.Date from oracle.jbo.Date?
(I think this is a bug!?)
Hi Dietmar,
this is documented in the help; see Reference - Business Components Oracle Domains; then choose 'Date'
Using the dateValue() method on the oracle.jbo.domain.Date gives you a java.sql.Date, which is a (subclass of) java.util.Date.
Cheers, Hans