Partition table export problem
Hi,
I am running an export for a user, using that user's credentials, and I get this error during the export:
EXP-00011: <table name> does not exist
but when I checked in the database, the table exists.
The table (master_fact) I am trying to export is a partitioned table.
All the other (non-partitioned) tables export properly.
DIRECT=Y RECORDLENGTH=65535 CONSTRAINTS=Y INDEXES=Y ROWS=Y BUFFER=102400000
FILE=par_1_exp.dmp
LOG=862_master_fact.log
TABLES=(master_fact )
Does this have anything to do with the partitioning? Also, when I tried exporting a single partition, it worked properly.
Please help me resolve the issue.
The Oracle DB version and export utility version are the same: 9.2.0.8.0.
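One thing worth checking (a guess, since the full parameter file is not shown): in 9i, exp resolves an unqualified TABLES entry against the exporting user's own schema, so if master_fact is owned by a different user, EXP-00011 is exactly what you would see even though the table exists. A hedged sketch of a parameter file; the schema name appowner is invented for illustration:

```text
USERID=appowner/secret
TABLES=(appowner.master_fact)
DIRECT=Y
RECORDLENGTH=65535
ROWS=Y
FILE=par_1_exp.dmp
LOG=862_master_fact.log
```

Invoked as `exp parfile=par_1_exp.par`. To export one partition at a time, the same parameter accepts the partition-qualified form TABLES=(appowner.master_fact:partition_name). Whether this is the actual cause here depends on who owns master_fact.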
Similar Messages
-
display:table.... /display:table export problem
HI,
when I wrote in the jsp:
<display:table pagesize="30" class="isis" name="contacts" export="true">
<display:column property="firstName"/>
<display:column property="lastName"/>
<display:column property="title"/>
<display:column property="gender"/>
<display:column property="phoneNumber"/>
<display:column property="email"/>
<display:column property="country.phonePrefix" title="Country"/>
<display:column property="birthdate"/>
</display:table>
There are Export | CSV | Excel | XML links for users to export the table.
But when I click CSV, nothing happens.
How can I fix that?
Thanks.

Hi jcatino,
Thank you very much.
But I also have another problem:
<display:table pagesize="20" class="isis" name="subscribers" id="subscriber" export="true" requestURI="manageContact.do">
<display:column sortable="true" >
<c:out value="${subscriber.UID}"/>
</display:column>
<logic:iterate id="group" name="groups" indexId="index2">
<logic:iterate id="extendedProfile" name="subscriber" property="extendedProfiles">
<logic:equal name="extendedProfile" property="source.id" value="${group.id}">
<display:column title="${group.name}">${extendedProfile.actualValue}</display:column>
</logic:equal>
</logic:iterate>
</logic:iterate>
</display:table>
The result I get from clicking "CSV" is:
"<a href=\"manageContact.do?contactId=10&task=do_edit\">
10
</a>",VW Jetta owners,2005-3-2 00:00:00.000000000,aa,2233,2006-10-30 00:00:00.000000000,[email protected],12345
"<a href=\"manageContact.do?contactId=11&task=do_edit\">
11
</a>",VW Jetta owners,2004-2-28 00:00:00.000000000,aa,aacc,2006-11-30 00:00:00.000000000,[email protected],12345
How can I delete the <a ...> </a> markup?
I have read the information on the website: http://displaytag.sourceforge.net/configuration.html
However, I have no idea how to solve it.
Thank you
happybaobao -
Hi Expert, Crystal Report export problem, system not responding
Hi,
I have a Crystal Report export problem: the system stops responding during export.
Thanks
Rajkumar Gupta

Dear Raj,
Please try this:
Try
Dim oSubReport As CrystalDecisions.CrystalReports.Engine.SubreportObject
Dim rptSubReportDoc As CrystalDecisions.CrystalReports.Engine.ReportDocument
Dim rptView As New CrystalDecisions.Windows.Forms.CrystalReportViewer
Dim rptPath As String = System.Windows.Forms.Application.StartupPath & "\" & rptName
Dim rptDoc As New CrystalDecisions.CrystalReports.Engine.ReportDocument
rptDoc.Load(rptPath)
rptView.ShowExportButton = True
rptView.ReportSource = rptDoc
For Each oMainReportTable As CrystalDecisions.CrystalReports.Engine.Table In rptDoc.Database.Tables
oMainReportTable.Location = System.Windows.Forms.Application.StartupPath & "\" & SourceXML
Next
For Each rptSection As CrystalDecisions.CrystalReports.Engine.Section In rptDoc.ReportDefinition.Sections
For Each rptObject As CrystalDecisions.CrystalReports.Engine.ReportObject In rptSection.ReportObjects
If rptObject.Kind = CrystalDecisions.Shared.ReportObjectKind.SubreportObject Then
oSubReport = CType(rptObject, CrystalDecisions.CrystalReports.Engine.SubreportObject)
rptSubReportDoc = oSubReport.OpenSubreport(oSubReport.SubreportName)
For Each oSubTable As CrystalDecisions.CrystalReports.Engine.Table In rptSubReportDoc.Database.Tables
oSubTable.Location = System.Windows.Forms.Application.StartupPath & "\" & SourceXML
Next
End If
Next
Next
'Setting Paper
Dim rawKind As Integer = 0
Dim printSet As New System.Drawing.Printing.PrinterSettings
For i As Integer = 0 To printSet.PaperSizes.Count - 1
If printSet.PaperSizes.Item(i).PaperName.ToUpper = PaperName.ToUpper Then
rawKind = CInt(printSet.PaperSizes.Item(i).RawKind)
Exit For
End If
Next
Dim MyTest As New SaveFileDialog
rptDoc.PrintOptions.PaperSize = CType(rawKind, CrystalDecisions.Shared.PaperSize)
rptDoc.ExportToStream(ExportFormatType.Excel)
'rptDoc.SaveAs("C:\TBKING.xls", True)
'''How to export the report
Try
Dim CrExportOptions As ExportOptions
Dim CrDiskFileDestinationOptions As New _
DiskFileDestinationOptions()
Dim rename As String
rename = rptName.Replace(".rpt", "")
Dim CrFormatTypeOptions As New ExcelFormatOptions
CrDiskFileDestinationOptions.DiskFileName = _
"c:\Report\" & rename & "_Export_File.xls"
CrExportOptions = rptDoc.ExportOptions
With CrExportOptions
.ExportDestinationType = ExportDestinationType.DiskFile
.ExportFormatType = ExportFormatType.Excel
.DestinationOptions = CrDiskFileDestinationOptions
.FormatOptions = CrFormatTypeOptions
End With
rptDoc.Export()
Catch ex As Exception
MsgBox(ex.ToString)
End Try
'' end by kevin shah
rptView.Show()
rptView.ShowExportButton = True
Dim oFrm As New System.Windows.Forms.Form
rptView.DisplayGroupTree = True
rptView.Dock = System.Windows.Forms.DockStyle.Fill
rptView.Location = New System.Drawing.Point(0, 0)
oFrm.AutoScaleBaseSize = New System.Drawing.Size(5, 13)
oFrm.Controls.Add(rptView)
oFrm.Name = "Report Viewer"
oFrm.Text = "Report Viewer11"
oFrm.ResumeLayout(True)
oFrm.WindowState = System.Windows.Forms.FormWindowState.Maximized
oFrm.TopMost = True
oFrm.ShowDialog()
Catch ex As Exception
objMain.objApplication.MessageBox(ex.Message)
End Try
By pressing this button, an XLS file will be generated in C:\Report\.
Hope this resolves the issue.
Thanks
Kevin -
Slow split table export (R3load and WHERE clause)
For our split table exports, we used custom coded WHERE clauses. (Basically adding additional columns to the R3ta default column to take advantage of existing indexes).
The results have been good so far. Full tablescans have been eliminated and export times have gone down, in some cases, tables export times have improved by 50%.
However, our biggest table, CE1OC01 (120 GB), continues to be a bottleneck. Initially, after using the new WHERE clause, it looked like performance gains were dramatic, with export times for the first 5 packages dropping from 25-30 hours down to 1 1/2 hours.
However, after 2 hours, the remaining CE1OC01 split packages had shown no improvement. This is very odd, and we are trying to determine why part of the table exports very fast while other parts run very slowly.
Before the custom WHERE clauses, the export server had run into issues with SORTHEAP being exhausted, so we thought that might be the culprit. But that does not seem to be an issue now, since the improved WHERE clauses have reduced or eliminated excessive sorting.
I checked the access path of all the CE1OC01 packages, through EXPLAIN, and they all access the same index to return results. The execution time in EXPLAIN returns similar times for each of the packages:
CE1OC01-11: select * from CE1OC01 WHERE MANDT='212'
AND ("BELNR" > '0124727994') AND ("BELNR" <= '0131810250')
CE1OC01-19: select * from CE1OC01 WHERE MANDT='212'
AND ("BELNR" > '0181387534') AND ("BELNR" <= '0188469413')
0 SELECT STATEMENT ( Estimated Costs = 8.448E+06 [timerons] )
|
--- 1 RETURN
|
--- 2 FETCH CE1OC01
|
------ 3 IXSCAN CE1OC01~4 #key columns: 2
query execution time [millisec] | 333
uow elapsed time [microsec] | 429,907
total user CPU time [microsec] | 0
total system cpu time [microsec] | 0
Both queries utilize an index that has fields MANDT and BELNR. However, during R3load, CE1OC01-19 finishes in an hour and a half, whereas CE1OC01-11 can take 25-30 hours.
I am wondering if there is anything else to check on the DB2 access path side of things or if I need to start digging deeper into other aggregate load/infrastructure issues. Other tables don't seem to exhibit this behavior. There is some discrepancy between other tables' run times (for example, 2-4 hours), but those are not as dramatic as this particular table.
Another idea to test is to try and export only 5 parts of the table at a time, perhaps there is a throughput or logical limitation when all 20 of the exports are running at the same time. Or create a single column index on BELNR (default R3ta column) and see if that shows any improvement.
Anyone have any ideas on why some of the table moves fast but the rest of it moves slow?
We also notice that the "fast" parts of the table are at the very end of the table. We are wondering if perhaps the index is less fragmented in that range, a REORG or recreation of the index may do this table some good. We were hoping to squeeze as many improvements out of our export process as possible before running a full REORG on the database. This particular index (there are 5 indexes on this table) has a Cluster Ratio of 54%, so, perhaps for purposes of the export, it may make sense to REORG the table and cluster it around this particular index. By contrast, the primary key index has a Cluster Ratio of 86%.
Here is the output from our current run. The "slow" parts of the table have not completed, but they average a throughput of 0.18 MB/min, versus the "fast" parts, which average 5 MB/min, a pretty dramatic difference.
package time start date end date size MB MB/min
CE1OC01-16 10:20:37 2008-11-25 20:47 2008-11-26 07:08 417.62 0.67
CE1OC01-18 1:26:58 2008-11-25 20:47 2008-11-25 22:14 429.41 4.94
CE1OC01-17 1:26:04 2008-11-25 20:47 2008-11-25 22:13 416.38 4.84
CE1OC01-19 1:24:46 2008-11-25 20:47 2008-11-25 22:12 437.98 5.17
CE1OC01-20 1:20:51 2008-11-25 20:48 2008-11-25 22:09 435.87 5.39
CE1OC01-1 0:00:00 2008-11-25 20:48 0.00
CE1OC01-10 0:00:00 2008-11-25 20:48 152.25
CE1OC01-11 0:00:00 2008-11-25 20:48 143.55
CE1OC01-12 0:00:00 2008-11-25 20:48 145.11
CE1OC01-13 0:00:00 2008-11-25 20:48 146.92
CE1OC01-14 0:00:00 2008-11-25 20:48 140.00
CE1OC01-15 0:00:00 2008-11-25 20:48 145.52
CE1OC01-2 0:00:00 2008-11-25 20:48 184.33
CE1OC01-3 0:00:00 2008-11-25 20:48 183.34
CE1OC01-4 0:00:00 2008-11-25 20:48 158.62
CE1OC01-5 0:00:00 2008-11-25 20:48 157.09
CE1OC01-6 0:00:00 2008-11-25 20:48 150.41
CE1OC01-7 0:00:00 2008-11-25 20:48 175.29
CE1OC01-8 0:00:00 2008-11-25 20:48 150.55
CE1OC01-9 0:00:00 2008-11-25 20:48 154.84

Hi all, thanks for the quick and extremely helpful answers.
Beck,
Thanks for the health check. We are exporting the entire table in parallel, so all the exports begin at the same time. Regarding the SORTHEAP, we initially thought that might be our problem, because we were running out of SORTHEAP on the source database server. Looks like for this run, and the previous run, SORTHEAP has remained available and has not overrun. That's what was so confusing, because this looked like a buffer overrun.
Ralph,
The WHERE technique you provided worked perfectly. Our export times have improved dramatically by switching to the forced full tablescan. Being always trained to eliminate full tablescans, it seems counterintuitive at first, but, given the nature of the export query, combined with the unsorted export, it now makes total sense why the tablescan works so much better.
Looks like you were right, in this case, the index adds too much additional overhead, and especially since our Cluster Ratio was terrible (in the 50% range), so the index was definitely working against us, by bouncing all over the place to pull the data out.
We're going to look at some of our other long running tables and see if this technique improves runtimes on them as well.
Thanks so much, that helped us out tremendously. We will verify the data from source to target matches up 1 for 1 by running a consistency check.
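As a side note, the MB/min column in these run logs is just size divided by elapsed time; a quick Python sketch, using figures taken from the logs, to recompute and sanity-check it:

```python
def mb_per_min(size_mb: float, elapsed: str) -> float:
    """Compute throughput from a size in MB and an H:MM:SS elapsed time."""
    h, m, s = map(int, elapsed.split(":"))
    minutes = h * 60 + m + s / 60
    return size_mb / minutes

# figures from the run logs: a "fast" and a "slow" CE1OC01 package
fast = mb_per_min(437.98, "1:16:13")   # CE1OC01-19
slow = mb_per_min(437.27, "40:14:47")  # CE1OC01-11
print(round(fast, 2), round(slow, 2))
```

This reproduces the roughly 30x gap between the fast and slow packages reported above.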
Look at the throughput difference between the previous run and the current run:
package time start date end date size MB MB/min
CE1OC01-11 40:14:47 2008-11-20 19:43 2008-11-22 11:58 437.27 0.18
CE1OC01-14 39:59:51 2008-11-20 19:43 2008-11-22 11:43 427.60 0.18
CE1OC01-12 39:58:37 2008-11-20 19:43 2008-11-22 11:42 430.66 0.18
CE1OC01-13 39:51:27 2008-11-20 19:43 2008-11-22 11:35 421.09 0.18
CE1OC01-15 39:49:50 2008-11-20 19:43 2008-11-22 11:33 426.54 0.18
CE1OC01-10 39:33:57 2008-11-20 19:43 2008-11-22 11:17 429.44 0.18
CE1OC01-8 39:27:58 2008-11-20 19:43 2008-11-22 11:11 417.62 0.18
CE1OC01-6 39:02:18 2008-11-20 19:43 2008-11-22 10:45 416.35 0.18
CE1OC01-5 38:53:09 2008-11-20 19:43 2008-11-22 10:36 413.29 0.18
CE1OC01-4 38:52:34 2008-11-20 19:43 2008-11-22 10:36 424.06 0.18
CE1OC01-9 38:48:09 2008-11-20 19:43 2008-11-22 10:31 416.89 0.18
CE1OC01-3 38:21:51 2008-11-20 19:43 2008-11-22 10:05 428.16 0.19
CE1OC01-2 36:02:27 2008-11-20 19:43 2008-11-22 07:46 409.05 0.19
CE1OC01-7 33:35:42 2008-11-20 19:43 2008-11-22 05:19 414.24 0.21
CE1OC01-16 9:33:14 2008-11-20 19:43 2008-11-21 05:16 417.62 0.73
CE1OC01-17 1:20:01 2008-11-20 19:43 2008-11-20 21:03 416.38 5.20
CE1OC01-18 1:19:29 2008-11-20 19:43 2008-11-20 21:03 429.41 5.40
CE1OC01-19 1:16:13 2008-11-20 19:44 2008-11-20 21:00 437.98 5.75
CE1OC01-20 1:14:06 2008-11-20 19:49 2008-11-20 21:03 435.87 5.88
PLPO 0:52:14 2008-11-20 19:43 2008-11-20 20:35 92.70 1.77
BCST_SR 0:05:12 2008-11-20 19:43 2008-11-20 19:48 29.39 5.65
CE1OC01-1 0:00:00 2008-11-20 19:43 0.00
558:13:06 2008-11-20 19:43 2008-11-22 11:58 8171.62
package time start date end date size MB MB/min
CE1OC01-9 9:11:58 2008-12-01 20:14 2008-12-02 05:26 1172.12 2.12
CE1OC01-5 9:11:48 2008-12-01 20:14 2008-12-02 05:25 1174.64 2.13
CE1OC01-4 9:11:32 2008-12-01 20:14 2008-12-02 05:25 1174.51 2.13
CE1OC01-8 9:09:24 2008-12-01 20:14 2008-12-02 05:23 1172.49 2.13
CE1OC01-1 9:05:55 2008-12-01 20:14 2008-12-02 05:20 1188.43 2.18
CE1OC01-2 9:00:47 2008-12-01 20:14 2008-12-02 05:14 1184.52 2.19
CE1OC01-7 8:54:06 2008-12-01 20:14 2008-12-02 05:08 1173.23 2.20
CE1OC01-3 8:52:22 2008-12-01 20:14 2008-12-02 05:06 1179.91 2.22
CE1OC01-10 8:45:09 2008-12-01 20:14 2008-12-02 04:59 1171.90 2.23
CE1OC01-6 8:28:10 2008-12-01 20:14 2008-12-02 04:42 1172.46 2.31
PLPO 0:25:16 2008-12-01 20:14 2008-12-01 20:39 92.70 3.67
90:16:27 2008-12-01 20:14 2008-12-02 05:26 11856.91 -
[SQL SERVER 2000] Generic table exporter
Hello everybody.
First of all, sorry for my bad English, but I'm French ;-)
My internship consists of making a generic table exporter (driven by a table list) that exports into CSV files.
I have tried 2 solutions:
1 - Create a DTS package with a Dynamic Properties task. The problem is that I can easily change the destination file, but when I change the source table I don't know how to remap the transformations between source and destination (do you see what I mean?). Any idea?
2 - Use the bcp command. Very simple, but what to do when a table contains the separator character? For example, if a table row is "toto" | "I am , very happy", the CSV file will look like this: toto, I am , very happy, and there is a problem getting the data back (too many commas).
Does someone have a solution?
The last point is how to export the table structure. At the moment, using the table structure, I generate an SQL query that creates the table (I write this query to a file). Isn't there any cleaner solution?
Thanks in advance and have a nice day, all.

Answers:
1. Use ActiveX script to transform. Refer
http://technet.microsoft.com/en-us/library/aa933459(v=sql.80).aspx
2. Replace the pipe delimiter first with comma if it is a single column and use bcp command. Refer
http://technet.microsoft.com/en-us/library/aa174646(v=sql.80).aspx
3. Regarding generating script refer
http://stackoverflow.com/questions/4058977/exporting-tables-and-indexes-from-one-sql-server-2000-database-to-another
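On point 2: rather than replacing delimiters, the standard fix is CSV quoting, which bcp does not do natively: any field containing the separator (or a quote) is wrapped in double quotes, and embedded quotes are doubled. A small Python sketch of the round trip, using the toto example from the question:

```python
import csv
import io

rows = [["toto", "I am , very happy"],
        ["titi", 'she said "hi"']]

# write with quoting so embedded commas and quotes survive
buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_MINIMAL).writerows(rows)

# read it back: each field is recovered intact, commas and all
back = list(csv.reader(io.StringIO(buf.getvalue())))
assert back == rows
```

The same quoting convention is understood by Excel and most CSV consumers, so no delimiter substitution is needed.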
Regards, RSingh -
Hi
I'm fairly new to CS4 and have picked up an audio export problem, more than likely with a setting of sorts. For some reason when I export to MPEG2 -DVD via the Media Encoder, the video portion is fine as well as any audio that has been placed and edited on tracks audio 1 and 2 in PP. However, if there is additional audio placed on tracks 3 or 4 for example (say a music MP3 overlay) then this does not come out ... only audio placed on tracks 1 and 2 in PP come out. I have checked the obvious like making sure the tracks aren't muted etc but to no avail. Also tried changing from PCM audio to MPEG ... Stereo to 5.1 mixdown ... nothing doing.
Any help would be greatly appreciated.
Thanks and regards
Videoman 1
SA

Thanks Harm ... got SnagIt and with a few helpful YouTube tutorials ... am in business. Hunt and John, also thanks ... the other method equally as good. Now, I have attempted to attach my screen shot which should display my CS4 timeline and associated audio monitor windows above. By the way, this was for a wedding and as indicated, you will notice an MP3 lying in audio track 3 which as mentioned is my problem as it is not audible after render via Media Encoder. But remember, the moment I slip this MP3 up into audio track 1 or 2 above ... problem solved. I know there must be something small like a setting gone wrong or incorrect. As mentioned, any help would be greatly appreciated.
Thanks to you all.
Regards Paul ( Vryheid, South Africa ) -
Export Problem in LR 3.5
Hi folks,
I have been working with LR 2.7 for a while now and was happy with it.
PC - Win XP - LR3.5
Photo's stored on external hard disk (K:)
The RAW captures of a Nikon 3100 I recently won were not being recognized/imported by LR 2.7 and I installed the 3.5 version to see if it was any better. I would have to buy it anyway by the time I switch to Win7. In 3.5 the Raw files were imported, the settings of 2.7 were nicely taken over but now and I cannot export the files I have worked on to the K: drive of the external hard drive as I used to do with 2.7.
I get the message 'The folder could not be found' though it is set in the Export Window and I am looking at it in Win Explorer.
The original RAW captures are on the D: drive but that was always the case with 2.7 as well... I am doing it exactly as I used to with 2.7.
I tried with a photo from an older folder of my usual camera and everything works fine there.
The problem seems to be only with the Nikon files. Now I start wondering if there is something with raw format of this Nikon camera that is putting LR in a twist...
Anybody any ideas to help me out?
Thanks and grts
Nil

@ web-weaver:
Of course they go to a different folder and they also get different
titles/names once processed in LR. The originals are always RAW (NEF) and I
@ ssprengel:
I get the message when I have renamed the file in the export window, set
the folder it has to go to and hit export.
In the 2.7 version I got it already when I tried to import.
As I said, this only happens with the Nikon (NEF) RAW files. With the
Olympus (ORF) RAW files I have no problem at all and had no problem in LR
2.7 either.
I cannot see anything wrong in the Export panel nor in Preferences.
Plug in is Metadata Wrangler and has been updated to the latest version >
no joy...
I tried import/export with jpg files and no problem there.
Yesss! Got it! What I had overlooked was a question mark next to the folder
in Library (sooo small). A rightclick and telling LR where it was solved
the problem. Strange, though, as it has never done this before... the main
folder and the path are still the same as the ones that work perfectly with
Olympus exports... just a different subfolder named Nikon and another named
Test... Also strange that if I started off with jpg, it was no problem
either...
Anyway, thanks a lot for shaking my senses for me
Nil
2011/12/10 web-weaver <[email protected]>
-
Base Table for problem code in Cs_incidents_all_b
Hi,
In cs_incidents_all_b we have problem_code, but the column does not contain any data. Is there any TL table for problem codes? I found cs_sr_prob_code_mapping_detail, but when I query this:
SELECT dra.repair_number,
items.description item_desc,
prob.problem_code,
fndl.meaning flow_status_name,
inc.summary,
nvl(cp.instance_number,'Not availble') ib_instance_number
FROM csd_repairs dra,
csd_repair_types_tl drtt,
cs_incidents_all_b sr,
csi_item_instances cp,
fnd_lookups fndl,
csd_flow_statuses_b fsb,
mtl_system_items_kfv items,
mtl_units_of_measure_tl uom,
jtf_rs_resource_extns_tl rstl,
jtf_rs_groups_tl rgtl,
fnd_lookups plkup,
cs_incidents_all_tl inc,
cs_sr_prob_code_mapping_detail prob,
cs_incident_types_b ty
WHERE dra.repair_type_id = drtt.repair_type_id
AND drtt.language = userenv('LANG')
AND dra.repair_mode = 'WIP'
AND dra.incident_id = sr.incident_id
AND dra.CUSTOMER_PRODUCT_ID = cp.INSTANCE_ID (+)
AND dra.flow_status_id = fsb.flow_status_id
AND fsb.flow_status_code = fndl.lookup_code
AND fndl.lookup_type = 'CSD_REPAIR_FLOW_STATUS'
AND dra.inventory_item_id = items.inventory_item_id
AND dra.unit_of_measure = uom.uom_code
AND uom.language = userenv('LANG')
AND dra.resource_id = rstl.resource_id (+)
AND rstl.category (+) = 'EMPLOYEE'
AND rstl.language (+) = userenv('LANG')
AND dra.owning_organization_id = rgtl.group_id (+)
AND rgtl.language (+) = userenv('LANG')
AND dra.ro_priority_code = plkup.lookup_code(+)
AND plkup.lookup_type(+) = 'CSD_RO_PRIORITY'
AND items.organization_id = cs_std.get_item_valdn_orgzn_id
AND inc.incident_id =dra.incident_id
and ty.incident_type_id=sr.incident_type_id
and prob.incident_type_id=ty.incident_type_id
AND fndl.meaning in('Open')
order by dra.repair_number

I get different problem codes for the same repair number; here I want only records relevant to Depot Repair.

In 11.5.9, the problem and resolution codes are stored in the FND_LOOKUP_VALUES table with lookup types 'REQUEST_PROBLEM_CODE' and 'REQUEST_RESOLUTION_CODE'. I'm hoping you can still use these tables to find problem codes, even if you are on 11.5.10 or R12.
Join would be something like:
WHERE fnd_lookup_values.lookup_type = 'REQUEST_PROBLEM_CODE'
AND fnd_lookup_values.lookup_code = cs_incidents_all_b.problem_code
Regarding restricting the query to Depot Repair service requests, you need to restrict by the incident_type_id for this type of SR (for example, the id for the Depot incident type is 10003 for us).
HTH
Alka -
Table update problem in tabstrip control
Hi experts,
I have a tabstrip control with 4 tabs.
In those tabs I am updating a database table, say ZTABLE.
My problem is that the update happens in the PAI of every tab,
but the changes I make to the table are not reflected in the other tabs.
Where should I write the code (in PBO?) and what should I do?
Whatever I update in tab 1 I need to see in tab 2, and whatever I update there I should see in tab 3.
But it does not show the updates when I go to the other tabs.
Thanks

Hi,
Please check the following things:
1. The function code type for each tab is blank.
2. The same subscreen area is assigned to each tab.
3. The corresponding subscreen is dynamically incorporated into the subscreen area via CALL SUBSCREEN in the flow logic.
If you are not doing this, then you are scrolling in the SAP GUI, not in your program. In that case the values entered in TAB1 won't be reflected in TAB2.
If this is the case in your program, fix the above three points; then it will work.
SQL*PLUS MULTI-TABLE QUERY PROBLEM
HI ALL,
ANY SUGGESTIONS PLEASE?
SUB: SQL*PLUS MULTI-TABLE QUERY PROBLEM
SQL*PLUS QUERY GIVEN:
SELECT PATIENT_NUM, PATIENT_NAME, HMTLY_TEST_NAME, HMTLY_RBC_VALUE,
HMTLY_RBC_NORMAL_VALUE, DLC_TEST_NAME, DLC_POLYMORPHS_VALUE,
DLC_POLYMORPHS_NORMAL_VALUE FROM PATIENTS_MASTER1, HAEMATOLOGY1,
DIFFERENTIAL_LEUCOCYTE_COUNT1
WHERE PATIENT_NUM = HMTLY_PATIENT_NUM AND PATIENT_NUM = DLC_PATIENT_NUM AND PATIENT_NUM
= &PATIENT_NUM;
RESULT OBTAINED:
&PATIENT_NUM =1
no rows selected
&PATIENT_NUM=2
no rows selected
&PATIENT_NUM=3
PATIENT_NUM 3
PATIENT_NAME KKKK
HMTLY_TEST_NAME HAEMATOLOGY
HMTLY_RBC_VALUE 4
HMTLY_RBC_NORMAL 4.6-6.0
DLC_TEST_NAME DIFFERENTIAL LEUCOCYTE COUNT
DLC_POLYMORPHS_VALUE 60
DLC_POLYMORPHS_NORMAL_VALUE 40-65
EXPECTED RESULT:
&PATIENT_NUM=1
PATIENT_NUM 1
PATIENT_NAME BBBB
HMTLY_TEST_NAME HAEMATOLOGY
HMTLY_RBC_VALUE 5
HMTLY_RBC_NORMAL 4.6-6.0
&PATIENT_NUM=2
PATIENT_NUM 2
PATIENT_NAME GGGG
DLC_TEST_NAME DIFFERENTIAL LEUCOCYTE COUNT
DLC_POLYMORPHS_VALUE 42
DLC_POLYMORPHS_NORMAL_VALUE 40-65
&PATIENT_NUM=3
PATIENT_NUM 3
PATIENT_NAME KKKK
HMTLY_TEST_NAME HAEMATOLOGY
HMTLY_RBC_VALUE 4
HMTLY_RBC_NORMAL 4.6-6.0
DLC_TEST_NAME DIFFERENTIAL LEUCOCYTE COUNT
DLC_POLYMORPHS_VALUE 60
DLC_POLYMORPHS_NORMAL_VALUE 40-65
There are 4 tables for the clinical lab, used to input data and to report only the tests performed for a particular patient.
TABLE1:PATIENTS_MASTER1
COLUMNS:PATIENT_NUM, PATIENT_NAME,
VALUES:
PATIENT_NUM
1
2
3
4
PATIENT_NAME
BBBB
GGGG
KKKK
PPPP
TABLE2:TESTS_MASTER1
COLUMNS:TEST_NUM, TEST_NAME
VALUES:
TEST_NUM
1
2
TEST_NAME
HAEMATOLOGY
DIFFERENTIAL LEUCOCYTE COUNT
TABLE3:HAEMATOLOGY1
COLUMNS:
HMTLY_NUM,HMTLY_PATIENT_NUM,HMTLY_TEST_NAME,HMTLY_RBC_VALUE,HMTLY_RBC_NORMAL_VALUE
VALUES:
HMTLY_NUM
1
2
HMTLY_PATIENT_NUM
1
3
HMTLY_TEST_NAME
HAEMATOLOGY
HAEMATOLOGY
HMTLY_RBC_VALUE
5
4
HMTLY_RBC_NORMAL_VALUE
4.6-6.0
4.6-6.0
TABLE4:DIFFERENTIAL_LEUCOCYTE_COUNT1
COLUMNS:DLC_NUM,DLC_PATIENT_NUM,DLC_TEST_NAME,DLC_POLYMORPHS_VALUE,DLC_POLYMORPHS_
NORMAL_VALUE,
VALUES:
DLC_NUM
1
2
DLC_PATIENT_NUM
2
3
DLC_TEST_NAME
DIFFERENTIAL LEUCOCYTE COUNT
DIFFERENTIAL LEUCOCYTE COUNT
DLC_POLYMORPHS_VALUE
42
60
DLC_POLYMORPHS_NORMAL_VALUE
40-65
40-65
THANKS
RCS
E-MAIL:[email protected]
I think you want an OUTER JOIN:
SELECT PATIENT_NUM, PATIENT_NAME, HMTLY_TEST_NAME, HMTLY_RBC_VALUE,
HMTLY_RBC_NORMAL_VALUE, DLC_TEST_NAME, DLC_POLYMORPHS_VALUE,
DLC_POLYMORPHS_NORMAL_VALUE
FROM PATIENTS_MASTER1, HAEMATOLOGY1, DIFFERENTIAL_LEUCOCYTE_COUNT1
WHERE PATIENT_NUM = HMTLY_PATIENT_NUM (+)
AND PATIENT_NUM = DLC_PATIENT_NUM (+)
AND PATIENT_NUM = &PATIENT_NUM;

Edited by: shoblock on Nov 5, 2008 12:17 PM
outer join marks became stupid emoticons or something. attempting to fix -
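The (+) notation above is Oracle-specific syntax for an outer join; in ANSI SQL the same query is written with LEFT OUTER JOIN. A self-contained demo using Python's sqlite3, with cut-down versions of the tables above (only a couple of columns kept, data taken from the VALUES lists):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patients_master1 (patient_num INTEGER, patient_name TEXT);
CREATE TABLE haematology1 (hmtly_patient_num INTEGER, hmtly_rbc_value INTEGER);
CREATE TABLE differential_leucocyte_count1 (dlc_patient_num INTEGER, dlc_polymorphs_value INTEGER);
INSERT INTO patients_master1 VALUES (1,'BBBB'),(2,'GGGG'),(3,'KKKK'),(4,'PPPP');
INSERT INTO haematology1 VALUES (1,5),(3,4);
INSERT INTO differential_leucocyte_count1 VALUES (2,42),(3,60);
""")

# LEFT OUTER JOIN keeps every selected patient, with NULLs where a test is missing
rows = con.execute("""
SELECT p.patient_num, p.patient_name, h.hmtly_rbc_value, d.dlc_polymorphs_value
FROM patients_master1 p
LEFT JOIN haematology1 h ON p.patient_num = h.hmtly_patient_num
LEFT JOIN differential_leucocyte_count1 d ON p.patient_num = d.dlc_patient_num
WHERE p.patient_num = 1
""").fetchall()
print(rows)  # patient 1 appears even though there is no DLC row for them
```

With the plain inner join from the question, patient 1 would disappear entirely, which is exactly the "no rows selected" behaviour reported.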
During the Unicode conversion, cluster table export takes too much time
Dear All,
During the Unicode conversion, the cluster table export took too much time: approximately 24 hours for 6 tables. Could you please advise how we can reduce the time?
Thanks
Jainnedra

Hello,
Use the latest R3load from the SAP Service Marketplace.
Also refer to this note:
Note 1019362 - Very long run times during SPUMG scans
Regards,
Nitin Salunkhe -
Export problems- it's renaming my MOVs to strange things and they're huge!
Hey guys,
strange fcp export problem. I have a 10 minute anim compressed fcp project, 1920x1080, 30fps on a top-of-the-range macbook pro with external terabyte HD. I did a test export a couple of days ago, because our deadline is really soon, and that all worked fine - a 10 minute quicktime, 100% anim codec, was 26GB. That was fine. I did another test clip, just a spinning bitmap for 10 mins, from AfterEffects with similar, succesful, results and file sizes.
Now, however, when I export, it comes out as 60GB, and FCP changes the filename to something which is the first half of the filename I chose, and a strange number for the second half which is different each time I try it. Something like "filena#CF342.mov", for example. This file seems to have video in it OK, but it's 3 times the size, and the data rate is much higher, too high to play back.
The only changes I've made since my successful test are replacing many of the low res previews with hi res versions, and adding 1 minute to the end. And I changed my QMaster setup, not sure if that's anything to do with it.
Anyone got any ideas? I'm getting pretty worried.....
thanks in advance....

Thanks for the quick reply! No, don't think that's the problem. The new render doesn't have an alpha channel - it's Animation 100% Millions (not Millions+), and I don't think that would add more than 25% to the file size anyway. It also wouldn't account for the strange renaming of filenames. And the increase in length is only 10%. The file size has tripled!
The final delivery format was going to be animation, 1920x280, because it's going to projected on a 30m wide cinema screen and live TV tomorrow night but our playback solution (while it handles the test files ok) can't handle the data rate of these strange new files: it's 800mb/s, which is pretty ridiculous.
The renaming of the files suggests (to me at least) some kind of bug/error, but it doesn't report any kinda error in either fcp or quicktime.
Plan B is to recompress the strange big file using h264 which gives an ok quality and is 10% of the size. Reason I'd like to do it animation is I need to do several versions with different gamma, so I need to do further processing on the rendered file after it's dumped from FCP.
Any thoughts? I'd like to get to the bottom of this.
one thought - I didn't tick "recompress all files" and some of the source DOES have alpha - would it be taking the alpha channel even though I didn't ask for it? -
Export Problem Solved (new problem)
I had posted earlier about an export problem; it was a silly mistake, as I had "hide extension" turned on, which is why it happened.
Now the problem is that when I export, the exported video has some pixcels and it hangs in between; don't know what's wrong.

What's a pixcel?
Pixel?
All video has pixels.
Seriously, your question is completely incomprehensible. -
How to export a large number of tables, table by table, in one go
Product: ORACLE SERVER
Date written: 2002-04-12
=====================================================
Purpose
When an export has to be taken table by table but there are too many tables to list them all in the TABLES option, the following method makes the job easier.
Explanation
1. Log in with sqlplus scott/tiger:
SQL> set heading off
SQL> set pagesize 5000 (at least the number of tables the user owns)
SQL> spool scott.out
SQL> select tname from tab;
SQL> exit
2. This stores all of the scott user's tables in scott.out:
$ vi scott.out
SQL> select tname from tab;
BONUS
DEPT
DUMMY
EMP
SALGRADE
SQL> exit
In the vi editor, delete the unneeded first and last two lines, then remove the trailing whitespace after each table name.
< Preparation: removing trailing whitespace and building the export file >
After opening the file:
1) :g/ /s///g   <- remove the trailing whitespace after the table names
2) :1
3) append a comma after the BONUS table name
4) :map @ j$. then Enter   <- macro to repeat step 3 on the next line
5) Shift+2 (keep it pressed)   <- appends a comma at the end of each following line
6) the very last line needs no comma
Split the out file into chunks of 100 lines each (fewer if the table names are long), and save each chunk under its own file name.
e.g. lines 1-100 as scott1.out, 101-200 as scott2.out, and so on; remove the comma on the last line of each file.
Compile the script4exp.c below to generate the shell scripts for the export. (If necessary, modify the export options inside the script before compiling.)
After compiling:
$ script4exp scott1.out scott1.sh scott tiger scott1.dmp scott1.log
$ script4exp scott2.out scott2.sh scott tiger scott2.dmp scott2.log
This produces scott1.sh, scott2.sh, ...; change their mode and run them as background jobs.
Notes:
1. Check the file size of each *.sh after the job finishes.
2. Where possible, take large tables out of the out file and export them separately.
====script4exp.c=================
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define EXPCMD "exp %s/%s buffer=52428800 file=%s log=%s tables="

int main(int argc, char **argv)
{
    FILE *ifp, *ofp;
    char buff[256], *pt;

    if (argc != 7) {
        printf("\nUSAGE :\n");
        printf("$ script4exp infile.out outfile.sh username passwd dmpfile.dmp logfile.log\n\n");
        exit(0);
    }
    if ((ifp = fopen(argv[1], "r")) == NULL) {
        printf("%s file open fail !!\n", argv[1]);
        exit(0);
    }
    if ((ofp = fopen(argv[2], "w")) == NULL) {
        printf("%s file open fail !!\n", argv[2]);
        exit(0);
    }
    fprintf(ofp, EXPCMD, argv[3], argv[4], argv[5], argv[6]);
    while (fgets(buff, sizeof(buff), ifp) != NULL) {
        if ((pt = strchr(buff, '\n')) != NULL)
            *pt = '\0';               /* strip the trailing newline */
        fprintf(ofp, "%s", buff);
        memset(buff, 0, sizeof(buff));
    }
    fprintf(ofp, "\n");
    fclose(ifp);
    fclose(ofp);
    return 0;
} -
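As an aside, the list preparation done above with vi macros and the C helper (strip trailing whitespace, split the table list into chunks of up to 100 names, join each chunk with commas) can also be sketched in a few lines of Python; the chunk size and file names here are illustrative:

```python
def chunk_table_list(names, size=100):
    """Strip whitespace and group table names into comma-joined chunks."""
    cleaned = [n.strip() for n in names if n.strip()]
    return [",".join(cleaned[i:i + size]) for i in range(0, len(cleaned), size)]

tables = ["BONUS ", "DEPT", " DUMMY", "EMP", "SALGRADE"]
for i, chunk in enumerate(chunk_table_list(tables, size=2), start=1):
    # each chunk becomes the tables= argument of one exp command
    print(f"exp scott/tiger file=scott{i}.dmp log=scott{i}.log tables={chunk}")
```

This avoids the manual comma-appending and file-splitting steps entirely.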
I have an external hard drive to which I export my encoded CS4 video files once I have finished editing. The problem is that, in doing so, space seems to be used also on my C drive. I need to find and delete these files on my C drive once my PP project files are rendered to my external drive. I have looked in temp files but they not there. Can anyone advise me on this?
That's done it S.V. Many thanks!
Lewis B.
Date: Mon, 13 Sep 2010 11:13:53 -0600
From: [email protected]
To: [email protected]
Subject: CS4 Export Problem
I'm a newbie so don't take my word as gospel buuuuttt....
If you're using Vista you might be making the same mistake as me. Go to Edit > Preferences > Media, hover the mouse over the media cache files location (i.e. C:\yadayada) and TYPE that into the bar at the top of My Computer. Don't click through and try to find it; TYPE it like you would an internet address in a web browser.
Silly Vista.
>