Concatenate Blobs
Hi,
I have two tables, each of which contains a BLOB column. I need to concatenate the contents of the two columns and store the new BLOB content in a new column (in one of the two tables, or in a new table).
Can you help me with where to start, please?
Thanks
For me this works just fine (tested on 10g XE):
SQL> create table testblob (id number, data blob);

Table created.

SQL> insert into testblob (id, data) values (1, empty_blob());

1 row created.

SQL> insert into testblob (id, data) values (2, empty_blob());

1 row created.
SQL> declare
  cursor cr is
    select data
    from testblob
    for update of data nowait;
  vcText VARCHAR2(30) := 'this is a text';
begin
  for rec in cr loop
    dbms_lob.open(rec.data, dbms_lob.LOB_READWRITE);
    dbms_lob.write(rec.data, length(vcText), 1, utl_raw.cast_to_raw(vcText));
    dbms_lob.close(rec.data);
  end loop;
end;
/

PL/SQL procedure successfully completed.
SQL> select dbms_lob.getlength(data) from testblob;
DBMS_LOB.GETLENGTH(DATA)
14
14
SQL> declare
  cursor cr(i_nId in number) is
    select data
    from testblob
    where id = i_nId
    for update of data nowait;
  blob1 BLOB;
  blob2 BLOB;
begin
  open cr(1);
  fetch cr into blob1;
  close cr;
  open cr(2);
  fetch cr into blob2;
  close cr;

  dbms_lob.open(blob1, dbms_lob.LOB_READWRITE);
  dbms_lob.append(blob1, blob2);
  dbms_lob.close(blob1);
end;
/

PL/SQL procedure successfully completed.
SQL> select dbms_lob.getlength(data) from testblob;
DBMS_LOB.GETLENGTH(DATA)
28
14
SQL>
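Applying the same calls to the original two-table question, one hedged sketch follows; the tables t1(id, b BLOB), t2(id, b BLOB), the result table t3(id, combined BLOB), and the matching on id are all assumptions, not taken from the thread:

```sql
-- Sketch only: assumes t1.b and t2.b are non-null for the chosen id.
DECLARE
  v_src1 BLOB;
  v_src2 BLOB;
  v_dest BLOB;
BEGIN
  SELECT b INTO v_src1 FROM t1 WHERE id = 1;
  SELECT b INTO v_src2 FROM t2 WHERE id = 1;

  -- Create the target row with an empty LOB and grab its locator.
  INSERT INTO t3 (id, combined) VALUES (1, empty_blob())
  RETURNING combined INTO v_dest;

  DBMS_LOB.COPY(v_dest, v_src1, DBMS_LOB.GETLENGTH(v_src1));
  DBMS_LOB.APPEND(v_dest, v_src2);
  COMMIT;
END;
/
```

Unlike appending blob2 onto blob1 as above, this leaves both source columns untouched.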
Similar Messages
-
How to (or can I even) concatenate BLOB
One practice for storing image files is creating a table with a column (named IMG) of the BLOB data type, then inserting each image into the table. The result is that the table contains multiple records:
ROW1: img1
ROW2: img2
ROW3: img3
ROW4: img4
Now, for some reason I need to concatenate the image data (not combine them), just concatenate them one after another into one record, then store it in the table. Logically speaking, it will be like this: img1 | img2 | img3 | img4
When I need it later, I'll issue a SELECT statement such as
SELECT img into v_img from TABLE_NAME where id = '12345';
next I need to separate and restore the v_img back to its individual image, i.e., img1, img2, img3, and img4.
I just wonder can I even do this?
Thanks
Scott

Hello;
I don't believe that's possible in the database. A BLOB is just a set of binary data. One set.
You should post more about your exact requirements.
Creating a Table Containing One or More LOB Columns
http://docs.oracle.com/cd/E11882_01/appdev.112/e18294/adlob_ddl.htm#i1007083
Oracle Database SecureFiles and Large Objects Developer's Guide:
http://docs.oracle.com/cd/E11882_01/appdev.112/e18294/toc.htm
LOB Storage Parameters
http://docs.oracle.com/cd/E11882_01/appdev.112/e18294/adlob_tables.htm#CIHEBABG
Multimedia User's Guide
http://docs.oracle.com/cd/E11882_01/appdev.112/e10777/toc.htm
Best Regards
mseberg
Edited by: mseberg on Mar 12, 2012 12:06 PM -
Inserting a LONG to a BLOB in the same table
I have a table with a LONG column and would like to concatenate all existing columns into a new column in the same table, defined as a BLOB. I'm trying to figure out a way to do this. I tried the following and got an error message:
insert into cma_search_test
select obj_id,
line_id,
doc_nmbr,
supplier_part_num,
mnfctr,
mnfctr_part_nmbr,
line_desc,
ext_desc_txt,
vend_name,
cma_effect_dt,
cma_expir_dt,
last_updt_usr,
last_updt_tmsp,
obj_id||
line_id||
doc_nmbr||
supplier_part_num||
mnfctr||
mnfctr_part_nmbr||
line_desc||
to_lob(ext_desc_txt)||
vend_name||
cma_effect_dt||
cma_expir_dt||
last_updt_usr||
last_updt_tmsp "All_Columns"
from cma_search
to_lob(ext_desc_txt)||
ERROR at line 22:
ORA-00932: inconsistent datatypes
Any help would be appreciated.
Thanks,
Tracy

You cannot concatenate a LONG field. You will have to use a procedure to accomplish what you want.
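A hedged sketch of that approach, done in two steps because TO_LOB is only legal as a standalone select-list item in CREATE TABLE AS SELECT or INSERT ... SELECT; the ext_desc_clob and all_columns column names are assumptions:

```sql
-- Step 1: copy the table, converting the LONG into a CLOB column.
CREATE TABLE cma_search_test AS
SELECT obj_id, line_id, doc_nmbr,            -- ... remaining columns ...
       TO_LOB(ext_desc_txt) AS ext_desc_clob
FROM   cma_search;

-- Step 2: build the concatenated column; || on a CLOB yields a CLOB,
-- so no 32k VARCHAR2 limit applies here.
ALTER TABLE cma_search_test ADD (all_columns CLOB);

UPDATE cma_search_test
SET    all_columns = obj_id || line_id || doc_nmbr || ext_desc_clob;
```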
Use the DBMS_LOB package. -
Join two source tables and replicat into a target table with BLOB
Hi,
I am working on an integration to source transaction data from legacy application to ESB using GG.
What I need to do is join two source tables (to de-normalize the area_id) to form the transaction detail, then transform by concatenating the transaction detail fields into a values-only CSV, and replicate it into the target ESB IN_DATA table's BLOB content field.
Based on what I have researched, a lookup that joins two source tables requires SQLEXEC, which doesn't support BLOB.
What alternatives are there, and what does GG recommend in such a use case?
Any helpful advice is much appreciated.
thanks,
Xiaocun

Xiaocun,
Not sure what your data looks like, but it's possible that the comma-separated value (CSV) requirement may be solved by something like this in your MAP statement:
colmap (usedefaults,
my_blob = @STRCAT (col02, ",", col03, ",", col04));
Since this is not 1:1 you'll be using a sourcedefs file, which is nice because it will do the datatype conversion for you under the covers (also a nice trick when migrating long raws to blobs). So col02 can be varchar2, col03 a number, and col04 a clob and they'll convert in real-time.
Mapping two tables to one is simple enough with two MAP statements; the harder challenge is joining operations from separate transactions, because OGG is operation-based and doesn't work on aggregates. It's possible you could end up using a combination of built-in parameters and functions with SQLEXEC and SQL/PL/SQL for more complicated scenarios, all depending on the design of the target table. But you have several scenarios to address.
For example, is the target table really a history table or are you actually going to delete from it? If just the child is deleted but you don't want to delete the whole row yet, you may want to use NOCOMPRESSDELETES & UPDATEDELETES and COLMAP a new flag column to denote it was deleted. It's likely that the insert on the child may really mean an update to the target (see UPDATEINSERTS).
If you need to update the LOB by appending or prepending new data then that's going to require some custom work, staging tables and a looping script, or a user exit.
Some parameters you may want to become familiar with if not already:
COLS | COLSEXCEPT
COLMAP
OVERRIDEDUPS
INSERTDELETES
INSERTMISSINGUPDATES
INSERTUPDATES
GETDELETES | IGNOREDELETES
GETINSERTS | IGNOREINSERTS
GETUPDATES | IGNOREUPDATES
Good luck,
-joe -
Concatenate multiple word documents into 1 long document in APEX
I have a requirement to concatenate multiple word documents into 1 long document in APEX but I'm not sure if it can be done
or where to begin.
I've been able to upload/download files in APEX and generate Word docs using BI Publisher, but I can't seem to find anyway to carryout the above.
Any feedback greatly received.
Thanks in advance.
Keith

Thanks for your reply Jari.
The files are never actually stored in the database as a BLOB or CLOB.
I'm basically trying to do the following:-
I have a third party Document Management System that stores a number of templates (.doc and .dot).
The files are stored in a directory on a file server.
I have been tasked with the job of creating a Document Generator type app in APEX that will allow the users to select a number of these files and concatenate them together into 1 large 'Master' Document.
I was hoping to be able to do this without having to go down the line of using external software functions like Java but I'll check out your suggestion.
Cheers
Keith -
Link to file (blob) with 2 primary keys
I am experiencing a problem in APEX 3.1.2 with viewing files uploaded through a simple form based on a table. If that table has a single column primary key, the link works perfectly. If the table has a two column primary key, the link throws a "pls/apex/apex_util.get_blob_file not found on server". I made a simple test case and if I change the table to use a 1 column PK, (and change the process on the APEX form to match), the link works. When I change the PK back to 2 columns (and the APEX processes), the link stops working on the same record. Is this a known issue? I couldn't find anything useful in Metalink or these forums.
I may be missing something obvious, but could someone repeat this test case and let me know if you encounter the same thing?
1) Create table called test_blob with columns
id1 number
id2 number
file_blob blob
filename varchar2
mimetype varchar2
update_date date
and a primary key of id1 only
2) Create an APEX form based on that table and configure the file browse item source with the mimetype fields, filename field, .... (You'll need to unhide the id1 field unless you bothered creating a trigger,etc to populate it)
3) Upload the document, PDF, ... of your choice
4) Go back to the form on that record and click the download link. It should work to open the file
5) Change the primary key on the table to use id1 and id2 and change the processes on the APEX form to use the 2 column primary key
6) Repeat step 4, but now the link no longer works. If you toggle back to the 1 column PK, the link works again.
I worked around it by making a new column that concatenates id1 and id2 so I can move my application forward with a 1 column PK, but this is frustrating. BTW, I did try to use the get_blob_file_src API to work around the problem, but the link it generates is identical to that generated automatically so the same problem occurs.
Rgds/Mark M.

It rather depends on what access you have and what the keys are, but for my table:
1) Drop existing primary key constraint
2) Add a new column called something like key_id
3) Change your trigger which currently sets your existing id to also set the new key_id by setting key_id := id1 || ' ' || id2;
4) Add new primary key constraint for new column
Note that to do these changes, you may need to empty the table.
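A hedged sketch of steps 1-4 for the test_blob table; the constraint and trigger names are assumptions:

```sql
ALTER TABLE test_blob DROP CONSTRAINT test_blob_pk;
ALTER TABLE test_blob ADD (key_id VARCHAR2(100));

-- Keep key_id in sync on every insert/update.
CREATE OR REPLACE TRIGGER test_blob_bi
BEFORE INSERT OR UPDATE ON test_blob
FOR EACH ROW
BEGIN
  :new.key_id := :new.id1 || ' ' || :new.id2;
END;
/

ALTER TABLE test_blob ADD CONSTRAINT test_blob_pk PRIMARY KEY (key_id);
```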
Rgds/Mark M. -
ORA-22275 :invalid LOB locator specified error while loading BLOBs/CLOBS.
Hello All,
I am trying to load BLOB/CLOB data from a client Oracle DB to the Oracle DB on our side. We are using ODI version 10.1.3.5.6, which reportedly has the issue of loading BLOBs/CLOBs solved. I am using
The extraction fails in the loading stage when inserting data into the C$ table with the following error.
"22275:99999 :java.sql.BatchUpdateException:ORA-22275:Invalid LOB locator specified".
Kindly let me know how I can resolve this issue as the requirement to load this data is very urgent.
Thanks,
John

One alternate way can be done outside ODI, as ODI is still not able to resolve this issue: you can trim these fields (CLOB/BLOB) into pieces, push the data as separate fields into ODI, and at the reporting end concatenate them again.
Maybe this will solve your problem... it solved mine.
--XAT -
Indexing multiple documents in one BLOB?
Hello,
I need to index a bundle of documents (PDF, DOC, PPT...) as a unit, with one index and probably a key identifying the bundle, so that not every single document must match the search string; instead the whole bundle is searched.
The use case is that I have bundles of documents that logically belong together. These bundles should be searched for a group of words, returning every bundle that contains all of these words, not only a single document containing all of them, so bundles are also found where some words are in one document and the rest in another.
I hope someone understands what I mean.

You can't really put them all into one BLOB column; it would be pretty hard to work out where one document ended and another started. Instead, I would suggest you put the documents into a separate table, one per row, with a "bundlenumber" column representing the links between them. Then have a table with one row per bundle, and create a user datastore which fetches the docs from the doc table, filters them using ctx_doc.policy_filter (or ctx_doc.ifilter), and concatenates the filter output.
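A hedged sketch of that user-datastore wiring; the bundles/docs table names, the bundle_proc name, and the dummy indexed column are assumptions, and the per-document filtering with CTX_DOC is only indicated in a comment:

```sql
-- Procedure the datastore calls per bundle row; the (ROWID, IN OUT CLOB)
-- signature is required by USER_DATASTORE.
CREATE OR REPLACE PROCEDURE bundle_proc (rid IN ROWID, tlob IN OUT NOCOPY CLOB) IS
BEGIN
  FOR d IN (SELECT doc_text FROM docs
            WHERE bundlenumber = (SELECT bundlenumber
                                  FROM bundles WHERE rowid = rid)) LOOP
    -- doc_text assumed already filtered to plain text, e.g. via
    -- CTX_DOC.POLICY_FILTER run over the original binary documents.
    DBMS_LOB.APPEND(tlob, d.doc_text);
  END LOOP;
END;
/

BEGIN
  CTX_DDL.CREATE_PREFERENCE('bundle_ds', 'USER_DATASTORE');
  CTX_DDL.SET_ATTRIBUTE('bundle_ds', 'PROCEDURE', 'bundle_proc');
END;
/

CREATE INDEX bundles_idx ON bundles (dummy_col)
  INDEXTYPE IS CTXSYS.CONTEXT PARAMETERS ('datastore bundle_ds');
```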
-
Concatenating TDMS Files
Hello,
I'm writing a script in DIAdem 2011 which combines data from multiple sources into a single TDMS file. Exporting into multiple TDMS files works fine (I can open them in DIAdem), but I'm having trouble combining all files into one TDMS file. Speed is important, so I tried to avoid loading all data into the data portal and saving again (though this works).
If I understood correctly, it is possible to directly concatenate the binary contents of the files into a single file. Is this correct? Because using the MS-DOS copy command, or using a VBS script in DIAdem with Stream objects, both result in TDMS files that I can't open anymore.
The error message says:
"TDS Exception in Intialize: Tds Error: TdsErrNotSupported(53):"
It seems only one of the files produces these problems. When I load this one file and save it again as TDMS, I can successfully combine them and load the resulting TDMS without errors.
Perhaps the file is fragmented and this leads to the error (the offending file is measurement data)? Is this expected behaviour for fragmented files?
How can I efficiently combine these files? I have to be able to combine multiple hundreds of files in pairs (e.g. combine 400 TDMS files in pairs into 200 TDMS files) fast, and I guess opening 200 files, saving again and combining is rather slow.
Is there an efficient way to solve this task?
As the offending file is actually TDMS data from a BLOB field in a SQL database, another solution might be loading the TDMS directly from the database into the data portal and saving from DIAdem, instead of dumping the BLOB data into a TDMS file on the HDD (I guess this results in a defragmented file, as opening and saving resolves the problems).
But I'm not sure how to achieve this. I'm using ADO to interface with the SQL-DB. The code looks approximately like this:
Set recordSet = CreateObject("ADODB.Recordset")
recordSet.Source = 'QUERY
recordSet.open
Set fso = CreateObject("Scripting.FileSystemObject")
'Create Stream object
Set BinaryStream = CreateObject("ADODB.Stream")
do
chunkSize=100
lngOffset=0
lngFileSize=recordSet.Fields.Item("Field").ActualSize
'Open the stream And write binary data To the object
BinaryStream.Open
do while lngOffset<lngFileSize
varChunk=RecordSet.Fields.Item("TDMS_DataInBLOB").GetChunk(chunkSize)
Call BinaryStream.Write(varChunk)
lngOffset=lngOffset+chunkSize
loop
'Save binary data To disk
Call BinaryStream.SaveToFile("path", adSaveCreateOverWrite)
BinaryStream.Close
Call RecordSet.MoveNext
loop until RecordSet.EOF
How can I change this code to write the data into the dataportal?
Thanks in advance.
Best regards,
grmume

Hi Brad Turpin,
I have already tried concatenating the files with the copy command. I don't have a hex editor at work, but judging by the file sizes and checking the contents with a text editor, the script seems to work; I just can't open the created file.
The script I used looks like this:
private Sub mergeFiles(filePath1,filePath2,outFilePath,bDeleteInput)
if bDeleteInput then
Call ExtProgram("cmd.exe","/Q /c copy /b """&filePath1&""" + """&filePath2&""" /b """&outFilePath&""" &del """&filePath1&"""&del """&filePath2&"""")
else
Call ExtProgram("cmd.exe","/Q /c copy /b """&filePath1&""" + """&filePath2&""" /b """&outFilePath&"""")
end if
End Sub
To test your script I defined the following variables:
NewFileName="C:\Documents and Settings\username\Desktop\new folder\combined.TDMS"
ChanNames(0)="file1"
ChanNames(1)="file2"
Folder="C:\Documents and Settings\username\Desktop\new folder"
DosDrv="C:\Documents and Settings\username\Desktop\BatchFile"
It completed successfully but the result is the same. I can open the two separate files (file1.tdms and file2.tdms) but I cannot open the combined.tdms.
Sadly I can't share the original files, and I can't use DIAdem to fill the files with dummy data, because opening in DIAdem and saving resolves the problems.
Do you have an idea what causes these problems? The possibility to combine TDMS files like that was a big bonus for me...
Can I somehow fill the files with dummy data without opening them in DIAdem? Or tell DIAdem to keep the original file structure but change the values?
Best regards,
grmume -
CLOB variable: concatenate or append?
I just wrote a FUNCTION to return XML from a web service. As the text can go over 32,767 characters, I figured a CLOB was the way to go:
FUNCTION Get_XML(I_URL VARCHAR2)
RETURN XMLTYPE
AS
  Page     CLOB;
  Response UTL_HTTP.HTML_PIECES;
BEGIN
  Response := UTL_HTTP.REQUEST_PIECES(I_URL);
  -- UTL_HTTP.REQUEST_PIECES returns 2000-byte chunks.
  FOR Chunk IN 1..Response.Count LOOP
    Page := Page || Response(Chunk);
  END LOOP;
  RETURN XMLTYPE(Page);
END Get_XML;

My question is about the best way to concatenate the pieces. Is Page := Page || Response(Chunk); good, or should I be using DBMS_LOB.APPEND or similar? Also, which is better, APPEND or WRITEAPPEND?

They are different; each has its own specific functionality, so instead of 'better' you mean 'suitable for my requirement'.
See the Online Oracle Documentation regarding APPEND or WRITEAPPEND:
"APPEND Procedures
This procedure appends the contents of a source internal LOB to a destination LOB."
http://docs.oracle.com/cd/B19306_01/appdev.102/b14258/d_lob.htm#sthref2972
"WRITEAPPEND Procedures
This procedure writes a specified amount of data to the end of an internal LOB."
http://docs.oracle.com/cd/B19306_01/appdev.102/b14258/d_lob.htm#sthref3230
Homepages:
http://www.oracle.com/pls/db102/homepage
http://www.oracle.com/pls/db112/homepage
If APPEND, does it make sense to use TO_CLOB() on the piece for explicit conversion?

APPEND expects a CLOB (or BLOB) as input, yes.
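For the function above, a hedged sketch of the loop rewritten to use a temporary CLOB and DBMS_LOB.WRITEAPPEND instead of repeated || concatenation:

```sql
FUNCTION Get_XML(I_URL VARCHAR2) RETURN XMLTYPE AS
  Page     CLOB;
  Response UTL_HTTP.HTML_PIECES;
  Result   XMLTYPE;
BEGIN
  Response := UTL_HTTP.REQUEST_PIECES(I_URL);
  DBMS_LOB.CREATETEMPORARY(Page, TRUE);
  FOR Chunk IN 1 .. Response.COUNT LOOP
    -- CLOB overload: amount in characters, buffer is the VARCHAR2 piece,
    -- so no TO_CLOB conversion is needed.
    DBMS_LOB.WRITEAPPEND(Page, LENGTH(Response(Chunk)), Response(Chunk));
  END LOOP;
  Result := XMLTYPE(Page);
  DBMS_LOB.FREETEMPORARY(Page);
  RETURN Result;
END Get_XML;
```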
And one more example from Tim Hall
http://www.oracle-base.com/articles/misc/retrieving-html-and-binaries-into-tables-over-http.php -
Need help to open a blob from a report into another tab or browser window.
Hello everyone.
I'm looking for a bit of guidance on something I'm trying to do in Apex.
I have a report that contains a blob. Users can click on the link to open it in the same browser window. I also know how to make it download if they click the link. But what I really want is to click the link and open the blob in another tab or browser window.
Thank you in advance for your help.
Dw
I should have noted I'm using version 3.2
Edited by: DW Brown on Feb 22, 2012 3:13 PM

DW Brown wrote:
It becomes a link from the column format section..
DOWNLOAD:<tablename>:<column>:ID::MIMETYPE:FILENAME:LAST_UPDATE_DATE::inline:Click Here
So far I haven't found a way to use something like "target=_blank".

One way would be to use a Dynamic Action to apply the <tt>target="_blank"</tt> attribute to each link, or convert them to use APEX pop-ups or a jQuery lightbox like fancyBox.
Create an example on apex.oracle.com if you need more assistance. -
Help, how to open and display blobs from tables
Dear all,
I am trying to store ms-word files on a table using a blob column.
Does anyone know how to open the files and display them from a form using 9iAS?
Thank you.
Carlos.

And there may be, but you won't likely find that here. Spend some time searching Google and maybe you'll find code that someone was nice enough to make freely available, although I wouldn't count on it. Were I a programmer who took the time to read those docs and write the code, I'd want to be paid for my time. But there are a lot of programmers who swear by freeware! You may get lucky.
-
How to retrieve all the data from a BLOB using view-generated accessor
I am using JDeveveloper 10g v. 10.1.3 and am storing an image in a database as a blob object and need to retrieve all of the data to get the entire image and store it in an ImageIcon. The code I have works partially in that it retrieves the correct data, but only gets a piece of it, leaving me with a partial image.
AppModuleImpl am;
ImageVwViewImpl vo;
am = (AppModuleImpl)panelBinding.getDataControl().getDataProvider();
vo = (ImageVwViewImpl)am.findViewObject("ImageVwView");
ImageVwViewRowImpl ivo = (ImageVwViewRowImpl)vo.getCurrentRow();
ImageIcon icon = new ImageIcon(ivo.getImage().getBytes(1, (int)ivo.getImage().getBufferSize()));
jULabel1.setIcon(icon);

I either need to know how to use a stream to get the data out (from the BlobDomain method getBinaryStream()), or how to get the other chunks of data separately.
edit: I know the problem is that getBufferSize() returns an int which is too small to hold all the data, but I need to know what to use instead. Thanks!

This is the code I'm using now. Same problem :(
AppModuleImpl am;
ImageVwViewImpl vo;
am = (AppModuleImpl)panelBinding.getDataControl().getDataProvider();
vo = (ImageVwViewImpl)am.findViewObject("ImageVwView");
ImageVwViewRowImpl ivo = (ImageVwViewRowImpl)vo.getCurrentRow();
ImageIcon icon = new ImageIcon(ivo.getImage().toByteArray());
jULabel1.setIcon(icon); -
Error while importing a table with BLOB column
Hi,
I have a table with a BLOB column. When I export such a table it gets exported correctly, but when I import the same into a different schema having a different tablespace, it throws this error:
IMP-00017: following statement failed with ORACLE error 959:
"CREATE TABLE "CMM_PARTY_DOC" ("PDOC_DOC_ID" VARCHAR2(10), "PDOC_PTY_ID" VAR"
"CHAR2(10), "PDOC_DOCDTL_ID" VARCHAR2(10), "PDOC_DOC_DESC" VARCHAR2(100), "P"
"DOC_DOC_DTL_DESC" VARCHAR2(100), "PDOC_RCVD_YN" VARCHAR2(1), "PDOC_UPLOAD_D"
"ATA" BLOB, "PDOC_UPD_USER" VARCHAR2(10), "PDOC_UPD_DATE" DATE, "PDOC_CRE_US"
"ER" VARCHAR2(10) NOT NULL ENABLE, "PDOC_CRE_DATE" DATE NOT NULL ENABLE) PC"
"TFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 STORAGE(INITIAL 65536 FREELISTS"
" 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "TS_AGIMSAPPOLOLIVE030"
"4" LOGGING NOCOMPRESS LOB ("PDOC_UPLOAD_DATA") STORE AS (TABLESPACE "TS_AG"
"IMSAPPOLOLIVE0304" ENABLE STORAGE IN ROW CHUNK 8192 PCTVERSION 10 NOCACHE L"
"OGGING STORAGE(INITIAL 65536 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEF"
"AULT))"
IMP-00003: ORACLE error 959 encountered
ORA-00959: tablespace 'TS_AGIMSAPPOLOLIVE0304' does not exist
I used the import command as follows :
imp <user/pwd@conn> file=<dmpfile.dmp> fromuser=<fromuser> touser=<touser> log=<logfile.log>
What can I do so that this table gets imported correctly?
Also tell me "whether the BLOB is stored in different tablespace than the default tablespace of the user?"
Thanks in advance.

Hello,
You can either:
1) create a tablespace with the same name in the destination where you are trying to import, or
2) get the DDL of the table, modify the tablespace name to reflect an existing tablespace in the destination, run the DDL in the destination database, and then run your import command with the option ignore=y, which will ignore all the create errors.
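For option 2, the relevant edit to the extracted DDL might look like this (USERS is an assumed existing tablespace in the destination; the column list stays exactly as exported):

```sql
CREATE TABLE cmm_party_doc (
  pdoc_doc_id      VARCHAR2(10),
  -- ... remaining columns as in the dump ...
  pdoc_upload_data BLOB
) TABLESPACE users
  LOB (pdoc_upload_data) STORE AS (TABLESPACE users);
```

This also answers the side question: the LOB segment goes wherever the LOB ... STORE AS clause points, which can differ from the table's (and the user's default) tablespace.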
Regards,
Vinay -
Error while trying to store PDF in Oracle's BLOB field
Hi folks!
I'm having problems while trying to store a PDF file in a BLOB field of an Oracle table.
This is the error message:
Exception in thread "main" java.sql.SQLException: Data size bigger than max size for this type: 169894
at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java:95)
at oracle.jdbc.dbaccess.DBDataSetImpl.setBytesBindItem(DBDataSetImpl.java:2414)
at oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java:1134)
at oracle.jdbc.driver.OraclePreparedStatement.setBytes(OraclePreparedStatement.java:2170)
at vgoactualizacion.Main.main(Main.java:84)
This is the piece of code I'm using (assuming conn is a connection to an Oracle 10g database; it's working):
String miProcedimientoAlmacenado = "{ call package_name.update_client(?,?, ?) }";
CallableStatement miComando = conn.prepareCall(miProcedimientoAlmacenado);
miComando.setString(1,miClienteID ); //first parameter : IN
//second parameter : IN
File miPDF = new File(pathPDF + "//" + miClienteID + ".pdf");
byte[] bodyIn = readFully(miPDF); //readFully procedure is shown below
miComando.setBytes(2, bodyIn); //THIS IS THE LINE WHERE THE ERROR IS PRODUCED
//3rd parameter: OUT
miComando.registerOutParameter(3, java.sql.Types.VARCHAR);
miComando.execute();
private static byte[] readFully(File miPDF) throws FileNotFoundException, IOException {
    FileInputStream fis = new FileInputStream(miPDF);
    byte[] tmp = new byte[1024];
    byte[] data = null;
    int sz, len = 0;
    while ((sz = fis.read(tmp)) != -1) {
        if (data == null) {
            // First chunk: copy it (don't alias tmp, which gets reused).
            data = new byte[sz];
            System.arraycopy(tmp, 0, data, 0, sz);
            len = sz;
        } else {
            // Grow the array and append the new chunk.
            byte[] narr = new byte[len + sz];
            System.arraycopy(data, 0, narr, 0, len);
            System.arraycopy(tmp, 0, narr, len, sz);
            data = narr;
            len = len + sz;
        }
    }
    fis.close();
    return data;
} //fin readFully
This approach converts the PDF file into an array of bytes, which is then stored as binary data in the BLOB field.
Thanx in advance, and looking forward to your comments.

You will probably need to use the setBinaryStream() method instead of setBytes().