Searching and parsing each row
I have two sets of txt files:
1) Links - 1 million records
2) Projects - 500 records
I need to know how to search each link for a project and map it back to the projects.
For example:
LINKS
11X3 O2W HUA00 ETH-1.5M001
11X8 O2U D1000 MUL0001
11X8 O2U D1000 MUL0002
PROJECTS
11X3
11X8
02W
02U
So as you can see, Link1 has two sites, split into two columns: SiteA = 11X3 and SiteB = 02W.
And so on....
There is some rubbish data in which there is no site at all:
[email protected] xxx001
[email protected] xxx001
[email protected] xxx001
[email protected]
which can be ignored.
Is there a parsing mechanism in Oracle to go through the data row by row? How do I start on this?
The position of SiteA and SiteB in table_link can be anywhere. Then my example is not that useful (judging from the sample data I assumed the two sites could be extracted).
As a simple approach you could run (several times)
select :project || '*' the_project, the_row
from (select '11X3 O2W HUA00 ETH-1.5M001' the_row from dual union all
      select '11X8 O2U D1000 MUL0001' the_row from dual union all
      select '11X8 O2U D1000 MUL0002' the_row from dual)
where instr(the_row,:project) > 0
specifying :project as a group ('11X' instead of the complete projects '11X3', '11X8', ...), being careful to get the least from the rubbish.
Having the rows from the table_links numbered, you could create a two column table project & links_row_id and then write a procedure containing an insert into that_table as select_as_above with :project as a parameter and submit that as a job. Then you just call that procedure 500 times.
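A sketch of that two-column table and procedure (table_links and its single text column the_row are assumed from the example above; adjust the names to the real table):

```sql
-- Mapping table: one row per (project, matching link row).
CREATE TABLE project_link_map
( project      VARCHAR2(10),
  links_row_id ROWID
);

-- Insert all link rows that contain the given project string.
CREATE OR REPLACE PROCEDURE map_project (p_project IN VARCHAR2) IS
BEGIN
  INSERT INTO project_link_map (project, links_row_id)
  SELECT p_project, t.ROWID
  FROM   table_links t
  WHERE  INSTR(t.the_row, p_project) > 0;
  COMMIT;
END map_project;
/
```

Each of the 500 calls to map_project does one full pass over the links table, so submitting them as jobs (as suggested) lets them run in the background.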
Regards
Etbin
Similar Messages
-
Search and Parse Text from Input.
Hi all,
i have a script which takes varchar2 as input. The varchar2 input holds filter criteria for my dynamic sql. I am just appending the input and getting the desired output.
But in one scenario I have to put some part of the input into the subquery and another part into the main query.
example
select name from emp
where 1=1
and name='XZ'
and exists (select 1 from dept where 1=1
and dept='Demo')
for above query i will get input as 'and name=''XZ'' and dept=''Demo'''
i want to search for "and dept='Demo'" and parse that.
Difficulties
the input can be in any order: the name condition can come first or second.
How do I generalise the search-and-parse for the above input?
Thanks

Can this be your solution?
SQL> select Ename from emp
2 where 1=1
3 &a
4 and exists (select 1 from dept where 1=1
5 and &b)
6 /
Enter value for a: and ename like '%A%'
old 3: &a
new 3: and ename like '%A%'
Enter value for b: DEPTNO = 10
old 5: and &b)
new 5: and DEPTNO = 10)
ENAME
ALLEN
WARD
MARTIN
BLAKE
CLARK
ADAMS
JAMES
7 rows selected. -
Retrieve 3 rows of data and put each row in different position on screen
I need to retrieve 3 consecutive rows of data (easy) and need to put each row's data in an entirely different location on the home page
for example,
row 1 needs to go up in the top left content box
row 2 needs to go in the large content area in the middle of the page
row 3 needs to be put down in the footer of the page
what's the most efficient way to do this?

Ok, I've tried to research this a bit more, but I'm no further forward.
So if I have a query...
<cfquery name="getData" datasource="foo">
SELECT *
FROM 00_pricelist
</cfquery>
<cfoutput query="GetData" startrow=1 maxrows=3 >
<!--- surrounding table 1 --->
#getData.uneak_code[1]#<br />
<!--- surrounding table 2 --->
#getData.uneak_code[2]#<br />
<!--- surrounding table 3 --->
#getData.uneak_code[3]#<br />
</cfoutput>
can you tell me what I'm doing wrong ? -
Make select for each row - another solution?
For example I have a function with day before selection:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
function ABS return boolean is
ret number(10,2);
begin
for sab in (select saldod from ndv_ccli, nd_clientes where
(:p_dtvenc-dtdoc) = (select min(:p_dtvenc-dtdoc) from ndv_ccli, nd_clientes
where (:p_dtvenc-dtdoc>0)and
ndv_ccli.cod_cli = cod_cli) and
ndv_ccli.cod_cli = cod_cli order by dtdoc desc,cddb )
loop
ret := sab.saldod;
end loop;
return (ret);
end;
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
In report I use the value of this function(a) in another function(b) to sum with few values.
Output of function(b) has a lot of rows, and for each row it runs the select in function(a) to find the value, but actually it is the same value every time. Is it possible to run this select only once and then use just its value in function(b)? Because it makes the report too slow...
tnx before

Thanks, but while compiling it says:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
REP-0749: After form trigger cannot reference a report column.
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< -
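One way to avoid re-running the select for every report row is to cache the result in a package variable, so the query executes only once per session and function(b) just calls the package function. A sketch (assuming the date can be passed as a plain parameter rather than a report bind; table and column names are taken from the function above):

```sql
CREATE OR REPLACE PACKAGE pkg_abs_cache IS
  FUNCTION f_saldod (p_dtvenc IN DATE) RETURN NUMBER;
END pkg_abs_cache;
/
CREATE OR REPLACE PACKAGE BODY pkg_abs_cache IS
  g_saldod NUMBER;            -- cached result, kept for the session
  g_filled BOOLEAN := FALSE;  -- has the query been run yet?

  FUNCTION f_saldod (p_dtvenc IN DATE) RETURN NUMBER IS
  BEGIN
    IF NOT g_filled THEN
      -- same cursor as in function ABS, with the bind passed as a parameter
      FOR sab IN (SELECT saldod
                  FROM   ndv_ccli, nd_clientes
                  WHERE  (p_dtvenc - dtdoc) =
                         (SELECT MIN(p_dtvenc - dtdoc)
                          FROM   ndv_ccli, nd_clientes
                          WHERE  p_dtvenc - dtdoc > 0
                          AND    ndv_ccli.cod_cli = cod_cli)
                  AND    ndv_ccli.cod_cli = cod_cli
                  ORDER BY dtdoc DESC, cddb)
      LOOP
        g_saldod := sab.saldod;
      END LOOP;
      g_filled := TRUE;
    END IF;
    RETURN g_saldod;
  END f_saldod;
END pkg_abs_cache;
/
```

Function(b) would then reference pkg_abs_cache.f_saldod(:p_dtvenc) instead of calling function(a) directly.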
Hi.
I have a .csv file on my filesystem.
Could you post some code to read it from the filesystem and parse every row in it?
Thanks a lot.

External tables are far easier to use... e.g.
I have a file on my server in a folder c:\mydata called test.txt which is a comma seperated file...
1,"Fred",200
2,"Bob",300
3,"Jim",50

As sys user:
CREATE OR REPLACE DIRECTORY TEST_DIR AS 'c:\mydata';
GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser;

Note: this creates a directory object pointing to a directory on the server; the physical directory must already exist on the server (Oracle does not create it).
As myuser:
SQL> CREATE TABLE ext_test
2 (id NUMBER,
3 empname VARCHAR2(20),
4 rate NUMBER)
5 ORGANIZATION EXTERNAL
6 (TYPE ORACLE_LOADER
7 DEFAULT DIRECTORY TEST_DIR
8 ACCESS PARAMETERS
9 (RECORDS DELIMITED BY NEWLINE
10 FIELDS TERMINATED BY ","
11 OPTIONALLY ENCLOSED BY '"'
12 (id,
13 empname,
14 rate
15 )
16 )
17 LOCATION ('test.txt')
18 );
Table created.
SQL> select * from ext_test;
ID EMPNAME RATE
1 Fred 200
2 Bob 300
3 Jim 50
SQL>
http://www.morganslibrary.org/reference/externaltab.html -
Hi all,
is there some procedure or function or whatever I can use to read and parse an XML file into Oracle?
The XML contains 30+ Megs of data, which would represent data in 10+ tables if you would convert it into flat files.
I know I can read the XML and save it line by line in a table, but I do not want to parse it myself. It would be cool if there was a tool or so that would read the dtd definition of the XML, then read the XML data, parse it and create temporary tables with the flat content of the xml (can you code object oriented with PL/SQL? this would even be cooler!).
Does someone have an idea how to do that?
Thanks,
Steff

Wow, I've never tried to parse an XML file quite that large
before. Does it contain a bunch of encoded binary data, or does it
have houndreds of thousands of xml-nodes?
In any event, I do have one thought:
Do you need absolutely everything in the file? If you only
need access to a small portion of it, it would be worth your while
to pull out only the "stuff" that you need before you parse it. You
could use regular expressions to strip out things you don't need,
or to pull out only the stuff you do need.
The more you can minimize the "parse" effort the better off
you'll be.
Alternatively, if it's a file describing many "records" of
the same type, it would be best if you could "chunk" the file and
parse each record individually. -
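If the XML can first be loaded into an XMLTYPE column, XMLTABLE can shred it into relational rows without hand-written parsing. A sketch, assuming Oracle 10g or later, a staging table my_xml_docs(xml_data XMLTYPE), and hypothetical element/attribute names (/root/record, @id, name):

```sql
SELECT x.id, x.name
FROM   my_xml_docs d,
       XMLTABLE('/root/record'
                PASSING d.xml_data
                COLUMNS
                  id   NUMBER       PATH '@id',
                  name VARCHAR2(50) PATH 'name') x;
```

For a 30+ MB file, chunking as suggested above (splitting it into many smaller documents) keeps memory use down.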
Memory leak/overload when looping by index over a large query and updating each DB record
I am importing a CSV file into a temporary table and then running a select query that joins my actual database table with the temporary table, looking for any changes in the data. If changes exist, the select query is looped from 1 to #recordCount# and an update is applied to each record via cfquery. It runs very quickly (much more quickly than looping the query itself), but my memory spikes and overloads after about 1500 updates are completed. I need to be able to do upwards of 20000 at a time, without killing my system. I have tried manually setting the runtime garbage collection to trigger after X number of loops, but that doesn't seem to help. I am running CF8. See below for loop example:
<cfloop from="1" to="#updatedRecordsQuery.recordCount#" index="a">
<cftry>
<cfquery datasource="#db#" name="doUpdate">
UPDATE
CI
SET
firstname = <cfqueryparam cfsqltype="cf_sql_varchar" value="#updatedRecordsQuery.firstname[a]#" />,
lastname = <cfqueryparam cfsqltype="cf_sql_varchar" value="#updatedRecordsQuery.lastname[a]#" />,
etc, for about 15 various fields
FROM
client_info CI
WHERE
CI.client_id = <cfqueryparam cfsqltype="cf_sql_integer" value="#updatedRecordsQuery.client_id[a]#" />
</cfquery>
<cfcatch type="database">
<cfset local.updateErrorList = listappend(local.updateErrorList,updatedRecordsQuery.client_id[a]) />
<cfset local.error = true />
</cfcatch>
</cftry>
</cfloop>

I would suggest using a set-based UPDATE instead of looping over the query object and updating each row one by one.
Procedure:
- Insert your CSV data into temp table.
- Use a select update SQL query to update the changed data instead of looping over a select query.
Example:
UPDATE
Table
SET
Table.col1 = other_table.col1,
Table.col2 = other_table.col2
FROM
Table
INNER JOIN
other_table
ON
Table.id = other_table.id
NOTE: You can put all your scripts in a Procedure. -
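The UPDATE ... FROM ... JOIN form above is SQL Server / T-SQL syntax; if the datasource happens to be Oracle, the equivalent set-based statement is a MERGE. A sketch with table and column names assumed from the question:

```sql
MERGE INTO client_info ci
USING temp_import t               -- the temp table loaded from the CSV
ON (ci.client_id = t.client_id)
WHEN MATCHED THEN
  UPDATE SET ci.firstname = t.firstname,
             ci.lastname  = t.lastname;  -- etc. for the remaining fields
```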
I have been searching high and low for this one. I have a vbscript that can successfully perform the function if one file is listed. It does a Wscript.echo on the results and if I run this via command using cscript, I can output to a text file
that way. However, I cannot seem to get it to work properly if I want it to search ALL the files in the folder. At one point, I was able to have it create the output file and appear as if it worked, but it never showed any results when the script
was executed and folder was scanned. So I am going back to the drawing board and starting from the beginning.
I also have a txt file that contains the list of string text entries I would like it to search for. Just for testing, I placed 4 lines of sample text and one single matching text in various target files and nothing comes back. The current script
I use for each file has been executed with a few hundred string text lines I want it to search against to well over one thousand. It might take awhile, but it works every time. The purpose is to let this run against various log files in a folder and
let it search. There is no deleting, moving, changing of either the target folder/files to run against, nor of the file that contains the strings to search for. It is a search (read) only function, going thru the entire contents of the folder and
when done, performs the loop function and onto the next file to repeat the process until all files are searched. When completed, instead of running a cscript to execute the script and outputting the results to text, I am trying to create that as part
of the overall script. Saving yet another step for me to do.
My current script is set to append to the same results file and will echo [name of file I am searching]: No errors found. Otherwise, the
output shows the filename and the string text that matched. Because the results append to it, I can only run the script against each file separately or create individual output names. I would rather not do that if I could include it all in one.
This would also free me from babysitting it and running each file script separately upon the other's completion. I can continue with my job and come back later and view the completed report all in one. So
if I could perform this on an entire folder, then I would want the entries to include the filename, the line number that the match occurred on in that file and the string text that was matched (each occurrence). I don't want the entire line to be listed
where the error was, just the match itself.
Example (each match, its corresponding filename and line number all go together on the same line):
File1.txt Line 54 Job terminated unexpectedly
File1.txt Line 58 Process not completed
File1.txt Line 101 User input not provided
File1.txt Line 105 Process not completed
File2.txt No errors found
File3.txt Line 35 No tape media found
File3.txt Line 156 Bad surface media
File3.txt Line 188 Process terminated
Those are just random fake examples for this post.
This allows me to perform analysis on a set of files for various projects I am doing. Later on, when the entire search is completed, I can go back to the results file and look and see what files had items I wish to follow up on. Therefore, the
line number that each match was found on will allow me to see the big picture of what was going on when the entry was logged.
I actually import the results file into a spreadsheet, where further information is stored regarding each individual text string I am using. Very useful.
If you know how I can successfully achieve this in one script, please share. I have seen plenty of posts out there where people have requested all different aspects of it, but I have yet to see it all put together in one and work successfully.
Thanks for helping.

I'm sorry. I was so consumed in locating the issue that I completely overlooked posting what exactly I was needing help with. I did have one created, but I came across one that seemed more organized than what I originally created. Later
on I would learn that I had an error in log location on my original script and therefore thought it wasn't working properly. Now that I am thinking that I am pretty close to achieving what I want with this one, I am just going to stick with it.
However, I could still use help on it. I am not sure what I did not set correctly or perhaps overlooking as a typing error that my very last line of this throws an "Expected Statement" error. If I end with End, then it still gives same
results.
So to give credit where I located this:
http://vbscriptwmi.uw.hu/ch12lev1sec7.html
I then adjusted it for what I was doing.
What this does is search through the log files in a directory you specify when prompted. It looks for words contained in another file (oFile2) and outputs all matching words in each of those log files to another file: errors.log
Once all files are scanned to the end, the objects are closed and a message is echoed letting you know (whether errors were found or not), so you know the script has completed.
What I had hoped to achieve was an output to the errors.log (when matches were found) the file name, the line number that match was located on in that file and what was the actual string text (not the whole line) that matched. That way, I can go directly
to each instance for particular events if further analysis is needed later on.
So I could use help on what statement should I be closing this with. What event, events or error did I overlook that I keep getting prompted for that. Any help would be appreciated.
Option Explicit
Const ForReading = 1
'Prompt user for the log folder they want to search
Dim varLogPath
varLogPath = InputBox("Enter the complete path of the logs folder.")
'Create filesystem object
Dim oFSO
Set oFSO = WScript.CreateObject("Scripting.FileSystemObject")
'Create the output file that will contain errors found during the search
Dim oTSOut
Set oTSOut = oFSO.CreateTextFile("c:\Scripts\errors.log")
'Read the list of search strings once, up front
Dim oFile2, arrErrors
Set oFile2 = oFSO.OpenTextFile("c:\Scripts\livescan\lserrors.txt", ForReading)
arrErrors = Split(oFile2.ReadAll, vbCrLf)
oFile2.Close
'Loop through each file in the folder
Dim oFile, oTS, varLine, varLineNum, strName, varFoundNone
varFoundNone = True
For Each oFile In oFSO.GetFolder(varLogPath).Files
    'Verify files scanned are log files
    If LCase(Right(oFile.Name, 3)) = "log" Then
        'Open the log file ("Set" is required for object assignment)
        Set oTS = oFSO.OpenTextFile(oFile.Path, ForReading)
        varLineNum = 0
        'Read each line of the textstream
        Do Until oTS.AtEndOfStream
            varLine = oTS.ReadLine
            varLineNum = varLineNum + 1
            'Check this line against every search string
            For Each strName In arrErrors
                If Len(strName) > 0 Then
                    If InStr(1, varLine, strName, vbTextCompare) > 0 Then
                        'Write file name, line number and the matched string only
                        oTSOut.WriteLine oFile.Name & " Line " & varLineNum & " " & strName
                        varFoundNone = False
                    End If
                End If
            Next
        Loop
        oTS.Close
    End If
Next
oTSOut.Close
If varFoundNone = True Then
    WScript.Echo "No errors found."
Else
    WScript.Echo "Errors found. Check logfile for more info."
End If
SQL merge and after insert or update on ... for each row fires too often?
Hello,
there is a base table, which has a companion history table
- lets say USER_DATA & USER_DATA_HIST.
For each update on USER_DATA there has to be recorded the old condition of the USER_DATA record into the USER_DATA_HIST (insert new record)
- to have the history of changes to USER_DATA.
The first approach was to do the insert for the row trigger:
trigger user_data_tr_aiu after insert or update on user_data for each row

But the performance was bad, because a bulk update to USER_DATA resulted in individual inserts per record.
So i tried a trick:
Instead of doing the real insert into USER_DATA_HIST, i collect the USER_DATA_HIST data into a pl/sql collection first.
And later i do a bulk insert for the collection in the USER_DATA_HIST table with stmt trigger:
trigger user_data_tr_ra after insert or update on user_data

But sometimes I notice that the list of entries saved in the pl/sql collection holds more rows than the USER_DATA records being updated.
(BTW, for the update i use SQL merge, because it's driven by another table.)
As there is a uniq tracking_id in USER_DATA record, i could identify, that there are duplicates.
If i sort for the tracking_id and remove duplicate i get exactly the #no of records updated by the SQL merge.
So how comes, that there are duplicates?
I can try to make a sample 'sqlplus' program, but it will take some time.
But maybe somebody knows already about some issues here(?!)
- many thanks!
best regards,
Frank

Hello
Not sure really. Although it shouldn't take long to do a test case - it only took me 10 mins....
SQL>
SQL> create table USER_DATA
2 ( id number,
3 col1 varchar2(100)
4 )
5 /
Table created.
SQL>
SQL> CREATE TABLE USER_DATA_HIST
2 ( id number,
3 col1 varchar2(100),
4 tmsp timestamp
5 )
6 /
Table created.
SQL>
SQL> CREATE OR REPLACE PACKAGE pkg_audit_user_data
2 IS
3
4 PROCEDURE p_Init;
5
6 PROCEDURE p_Log
7 ( air_UserData IN user_data%ROWTYPE
8 );
9
10 PROCEDURE p_Write;
11 END;
12 /
Package created.
SQL> CREATE OR REPLACE PACKAGE BODY pkg_audit_user_data
2 IS
3
4 TYPE tt_UserData IS TABLE OF user_data_hist%ROWTYPE INDEX BY BINARY_INTEGER;
5
6 pt_UserData tt_UserData;
7
8 PROCEDURE p_Init
9 IS
10
11 BEGIN
12
13
14 IF pt_UserData.COUNT > 0 THEN
15
16 pt_UserData.DELETE;
17
18 END IF;
19
20 END;
21
22 PROCEDURE p_Log
23 ( air_UserData IN user_data%ROWTYPE
24 )
25 IS
26 ln_Idx BINARY_INTEGER;
27
28 BEGIN
29
30 ln_Idx := pt_UserData.COUNT + 1;
31
32 pt_UserData(ln_Idx).id := air_UserData.id;
33 pt_UserData(ln_Idx).col1 := air_UserData.col1;
34 pt_UserData(ln_Idx).tmsp := SYSTIMESTAMP;
35
36 END;
37
38 PROCEDURE p_Write
39 IS
40
41 BEGIN
42
43 FORALL li_Idx IN INDICES OF pt_UserData
44 INSERT
45 INTO
46 user_data_hist
47 VALUES
48 pt_UserData(li_Idx);
49
50 END;
51 END;
52 /
Package body created.
SQL>
SQL> CREATE OR REPLACE TRIGGER preu_s_user_data BEFORE UPDATE ON user_data
2 DECLARE
3
4 BEGIN
5
6 pkg_audit_user_data.p_Init;
7
8 END;
9 /
Trigger created.
SQL> CREATE OR REPLACE TRIGGER preu_r_user_data BEFORE UPDATE ON user_data
2 FOR EACH ROW
3 DECLARE
4
5 lc_Row user_data%ROWTYPE;
6
7 BEGIN
8
9 lc_Row.id := :NEW.id;
10 lc_Row.col1 := :NEW.col1;
11
12 pkg_audit_user_data.p_Log
13 ( lc_Row
14 );
15
16 END;
17 /
Trigger created.
SQL> CREATE OR REPLACE TRIGGER postu_s_user_data AFTER UPDATE ON user_data
2 DECLARE
3
4 BEGIN
5
6 pkg_audit_user_data.p_Write;
7
8 END;
9 /
Trigger created.
SQL>
SQL>
SQL> insert
2 into
3 user_data
4 select
5 rownum,
6 dbms_random.string('u',20)
7 from
8 dual
9 connect by
10 level <=10
11 /
10 rows created.
SQL> select * from user_data
2 /
ID COL1
1 GVZHKXSSJZHUSLLIDQTO
2 QVNXLTGJXFUDUHGYKANI
3 GTVHDCJAXLJFVTFSPFQI
4 CNVEGOTDLZQJJPVUXWYJ
5 FPOTZAWKMWHNOJMMIOKP
6 BZKHAFATQDBUVFBCOSPT
7 LAQAIDVREFJZWIQFUPMP
8 DXFICIPCBCFTPAPKDGZF
9 KKSMMRAQUORRPUBNJFCK
10 GBLTFZJAOPKFZFCQPGYW
10 rows selected.
SQL> select * from user_data_hist
2 /
no rows selected
SQL>
SQL> MERGE
2 INTO
3 user_data a
4 USING
5 ( SELECT
6 rownum + 8 id,
7 dbms_random.string('u',20) col1
8 FROM
9 dual
10 CONNECT BY
11 level <= 10
12 ) b
13 ON (a.id = b.id)
14 WHEN MATCHED THEN
15 UPDATE SET a.col1 = b.col1
16 WHEN NOT MATCHED THEN
17 INSERT(a.id,a.col1)
18 VALUES (b.id,b.col1)
19 /
10 rows merged.
SQL> select * from user_data_hist
2 /
ID COL1 TMSP
9 XGURXHHZGSUKILYQKBNB 05-AUG-11 10.04.15.577989
10 HLVUTUIFBAKGMXBDJTSL 05-AUG-11 10.04.15.578090
SQL> select * from v$version
2 /
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production

HTH
David -
Add dynamic buttons to each report row and bind the row data to buttons specific to that row
I have a page with a form at the top and a report at the bottom which is tied to a table. I need to add a pair of buttons for each row of the table and add dynamic actions for these buttons. I need to submit the data corresponding to the row to a REST service and refresh the table/row. I know it is possible to add ajax call and refresh the table after the response is received. But I am not sure how I can dynamically include a pair of buttons on each row and have the access the data corresponding to the record when a particular button is clicked. I was not able to find such a feature using the help page. Can you please let me know if this is possible. Thanks in advance.
Below is the representation of how I need the page to look like.
Col 1
Col 2
Col 3
data 1
data 21
data 31
Button 1
Button 2
data 2
data 22
data 32
Button 1
Button 2
data 3
data 23
data 33
Button 1
Button 2
data 4
data 24
data 34
Button 1
Button 2
I should be able to access data 1, data 21, data 31 from button 11 and button 21, etc.

select data1,
data2,
data3,
null button1,
null button2,
rowid r_id
from .....
If you edit the column for button1 and navigate to the Column Formatting region you can create your button here, several different ways. You will need to play around with how you prefer. See below:
<input class="button1" type="button" id="#R_ID#" value="label for the button" /> or <button id="#R_ID#">label for button</button> or you could use <a> and apply css of your theme.
You also create both buttons in a single column. From here you can reference any of the columns from your select by using the #COLUNMNAME# format (as I did with R_ID).
You will next need to do something similar for all columns you want to be able to grab when you click the buttons. See below:
--- This would be for data1 column
<span id="D1-#R_ID#">#DATA1#</span>
You would do this for each column that you want access to when the button is clicked, as mentioned above. You could bypass this step and use jQuery to traverse the DOM using .closest() and .find() and give each column a distinct class. It is your preference.
You would then create a dynamic action triggered when .button1 is clicked. Now that you have that object you can get the id of the triggering object which would be your r_id and from there you can access all your individual data points(all this done in a javascript in the dynamic action) and assign to hidden items on your page. The next step of that dynamic action you could execute pl/sql and pass those in page items.
Hope that all makes sense.
David
Message was edited by: DLittle
Just as a bit of clarification: the r_id column does not have to be rowid, but it does need to be unique for each row return. You could use your primary key or rownum. Whatever works for your scenario. -
I have a dynamic table that calculates the sum of all rows, no issue. I'm struggling with pulling out a subtotal though. I would like to have a check box in each row that flags those rows and gives the sum of their total. Any help would be greatly appreciated.
Here's something I threw together rq. The script is in the change event for the checkbox in the table. (Of course, you'll have to modify it to suit the names of your fields.)
var rows = xfa.resolveNodes("tblAmounts.Row1[*]");
var subtotal = 0;
for (var i = 0; i < rows.length; i++) {
    if (rows.item(i).cbAdd.rawValue == 1) {
        subtotal = subtotal + rows.item(i).nfAmount.rawValue;
    }
}
nfSubtotal.rawValue = subtotal;
How do I save rows results and add them up for each row.
I have the following select statement that saves the results in result1, result 2 and result3 variables.. How do I include a total (result1 + result2 + result3) for each row?
select TRUNC(AVG( SUM ( (DECODE (cc.request_wflow_status, 'Estimator Notified'
,cc.end_date,NULL) - DECODE (cc.request_wflow_status,
'Estimator Notified',cc.start_date,NULL)))),1) result1
,TRUNC(AVG( SUM ( (DECODE (cc.request_wflow_status, 'Estimator Accepted'
,cc.end_date,NULL) -
DECODE (cc.request_wflow_status,'Estimator Accepted',cc.start_date,NULL)))),1) result2
,TRUNC(AVG( SUM ( (DECODE (cc.request_wflow_status,
'Install Complete',cc.end_date,NULL) -
DECODE (cc.request_wflow_status,'Install Complete',cc.start_date,NULL)))),1) result3
FROM cc_request_status cc
GROUP BY cc.request_id
Thanks,

select result1, result2, result3,
       nvl(result1,0) + nvl(result2,0) + nvl(result3,0) total
from ( ...your query... );
-
I'm trying to create an Aperture book and have sorted 900 photos in manual order... the book application re-sorts my photos and I have to search 900 photos each time to place a photo. How can I set the sorting to stay in my manual setup?
1. If you manually sort Images in a Project, and then create a new Book w. "Include selected Images" checked, your manual sort order will be preserved.
2. As a workaround for when you have already created a Book, batch rename your Images with a leading index or counter after setting the order manually. You'll have to create a Naming Preset, e.g.: "{Index}_{Version Name}". The renaming should go in the order that you selected the Images, so be sure to select the first one first, and then use "Edit→Select all" to extend the selection to all Images. Once the Images are in your Book, a sort on Version Name should be the same as the manual sort you created.
3. At the risk of stating the obvious, you could create the Book, import the Images, and then sort them. The Book -- it is, to Aperture, just a special kind of Album -- retains its own manual sort order. I recommend using Albums this way. Use Projects to cull and develop Images, and use Albums for output. Sorting for a Book (or any output) is better done in the container devoted to that output (and not in the Project, which is more general).
Message was edited by: Kirby Krieger -- added #3. -
In the app "Numbers" I want to paste many rows into columns, with each row separated. Now it's all going into one column when I try to paste.
Fact is that I have a 4-page list with about 50 rows of single words and I want all that in columns.

Hey joshuafromisr,
If you reinstall iTunes, it should fix the issue. The following document will go over how to remove iTunes fully and then reinstall. Depending on what version of Windows you're running you'll either follow the directions here:
Removing and Reinstalling iTunes, QuickTime, and other software components for Windows XP
http://support.apple.com/kb/HT1925
or here:
Removing and reinstalling iTunes, QuickTime, and other software components for Windows Vista or Windows 7
http://support.apple.com/kb/HT1923
Best,
David -
:new., lob columns and after insert ... for each row trigger
I'm having this little problem with
my update/insert triggers and lob columns.
I have this little trigger:
CREATE OR REPLACE TRIGGER BLOB_DATA_INSERT
AFTER INSERT ON BLOB_DATA
FOR EACH ROW
BEGIN
INSERT INTO BLOB_DATA_CHANGE VALUES(CHANGE_SEQ.NEXTVAL, 'I', :NEW.ID,:NEW.DATA);
END;
Which works except when the DATA column
is a BLOB or CLOB (There is no data in :NEW.DATA. I even tried some of the DBMS_LOB package procedures)
The one thing that is different for the lob columns is that the application updating
the data is using a bind statement with returning, like:
INSERT INTO BLOB_DATA
VALUES(:ID,EMPTY_BLOB())
RETURNING DATA INTO :DATA;
COMMIT;
Thanks for any help