Animating a Large Number of Small Objects in CS3
A client originally asked me to animate a single butterfly in a seven-second shot.
Now he wants lots and lots of butterflies flying in the scene, like a mini butterfly swarm.
Is there an easy way to take the butterfly that I've already created and put lots of them into the scene, without making the process too laborious? I don't want to take the time to import a hundred butterflies in the comp and specify a motion path for each one.
Is there a tutorial available anywhere for this type of procedure?
thx
>So this is something that can't be done in AE without buying plug-ins?
Not at all. It's just that a third-party plugin happens to be the best and most flexible tool for the job (assuming you don't want to go to a 3D program for this effect). You can get a bunch of butterflies onscreen using Foam, but your only choice is for them to flap their wings in unison, whereas Particular will allow you to sample the time of each butterfly individually, even doing subframe sampling. Also, the Foam effect will be strictly 2D, whereas Particular's particles inhabit 3D space (though the particles themselves are still 2D images). CC Particle World can do 3D particle simulations, but is pretty restrictive as to how the particles move, and while its particles have 3D orientation, this can ironically make them look more fake, since what you get is a flat 2D image rotating in 3D space.
And Particle Playground, while flexible in a lot of ways, is just a world of hurt to actually use. It is very slow, it can't do some pretty basic things one expects a particle system to do, and it requires elaborate layer maps to control parameters that just require tweaking a couple of numerical properties in other particle systems. And it's limited to 2D. I'm pretty sure it's only still included in AE for the sake of backward-compatibility.
And no, Particular isn't slow, it's actually quite speedy as particle systems go (at least in my experience).
Similar Messages
-
Hi All,
How can the process of "splitting a large purchase order into smaller orders based on certain criteria and processing them together, such that one blocked order blocks all the orders" be handled efficiently in SAP?
Would appreciate your inputs on the above.
Thank you.
Abhay.
Hi,
The scenario is that:
1. The customer sends a big purchase order.
2. This purchase order is then split up into smaller orders based on certain criteria, and these are processed together.
3. These orders are then shipped together (this is achieved by customized developments at various order-processing stages).
Important point to note: if any processing step fails for any of the orders, the further processing of all the orders should stop.
This is affecting system performance to a great extent (due to the large volume of orders).
How can the system efficiency be increased, either with the existing approach or in some other way?
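SAP specifics aside, the split-then-process-together pattern can be sketched generically; the field name "plant" and the failure rule below are illustrative placeholders, not the actual SAP criteria:

```python
# Sketch: split a large order into smaller orders by a criterion,
# then process them together, stopping all if any one step fails
# (mirroring "one blocked order blocks all the orders").
# Field names and the step function are made up for illustration.

def split_order(items, key):
    """Group order items into sub-orders by the given criterion."""
    groups = {}
    for item in items:
        groups.setdefault(item[key], []).append(item)
    return list(groups.values())

def process_together(sub_orders, step):
    """Run one processing step on every sub-order; if any fails,
    stop further processing of the whole set."""
    results = []
    for order in sub_orders:
        if not step(order):
            return None  # one failure blocks all orders
        results.append(order)
    return results

items = [{"plant": "A", "qty": 5}, {"plant": "B", "qty": 2}, {"plant": "A", "qty": 1}]
subs = split_order(items, "plant")
assert len(subs) == 2
assert process_together(subs, lambda o: sum(i["qty"] for i in o) > 0) is not None
```

For performance, the point of the sketch is that grouping is a single pass over the items, so the cost of the split itself stays linear in the order size; the expensive part is whatever each processing step does.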
Thank you
Abhay. -
Simultaneous change of properties of a large number of objects
Hi there,
is there a way to change the properties, especially the qualification, of a large number (>100) of objects in Universe Designer simultaneously?
Or is there another efficient way to import a file that consists of only 5 dimensions and 100 measures?
Best regards,
Uwe
What exactly do you mean by Qualification?
Do you want to change the object definition from
tablename.columnname
to
schema.tablename.columnname ?
Regards,
Stratos -
Querying for large number of objects... searchspec limitation
As part of a product I'm developing, I may come across a scenario where I need to query for 100+ specific objects based on ids.
I know the query input for WS 2.0 has a "searchspec" string field, but based on the sheer number of specific objects I need to query for, I'm afraid the string may eventually get too large.
Is there a way around this? Can I send multiple individual queries in a batch request? Can I add more than one search object to a single query page request? Anything?
Thanks!
-Kevin
A few options are available using the WS v2.0 Query methods:
1. Use arguments like the pagesize and startrownumber arguments. These allow you to specify the page size of the recordset to be returned and the starting row number.
2. The searchspec is a powerful argument, and it supports a set of binary and unary operators. Refer to the On Demand user guide for a more complete set of operators supported by searchspec. To narrow the results of your query, you could use the "AND" operator between 2 or more fields in your object query.
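One way to keep any single searchspec string small is to chunk the id list into several smaller queries. A generic sketch of the string construction only; the field name 'Id' and the quoting/OR syntax are assumptions, so check the On Demand guide for the exact searchspec grammar, and whether batching requests is supported is up to the API:

```python
# Sketch: batch a large id list into several smaller searchspec
# strings so no single query string grows too large.

def build_searchspecs(ids, batch_size=50):
    specs = []
    for i in range(0, len(ids), batch_size):
        batch = ids[i:i + batch_size]
        # join one batch of id equality tests with OR
        specs.append(" OR ".join(f"[Id] = '{x}'" for x in batch))
    return specs

specs = build_searchspecs([f"ROW-{n}" for n in range(120)], batch_size=50)
assert len(specs) == 3           # 120 ids -> batches of 50, 50, 20
assert specs[0].count(" OR ") == 49
```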
Hope this helps.
Jaya -
Trouble copying a large number of objects using Acrobat X
Acrobat X is many times slower than Acrobat 9 when copying a large number of objects in a PDF. What used to take one second in Acrobat 9 now takes upwards of 45 seconds or longer in Acrobat X and often causes the application to crash. I am using 10.2.1 on OS X 10.6.8. Has anyone else experienced this performance difference, or have any solutions? Any thoughts on the subject would be much appreciated. Thanks.
Since you do not want to crop your images to a square 1:1 aspect ratio, changing the canvas to be square will not make your images square; they will retain their aspect ratio, and the image will be resized to fit within your 1020 px square, leaving a border on one side or two borders on opposite sides. You do not need a script, because Photoshop ships with a plug-in script that can be used in Actions. What is good about plug-ins is that they support Actions: when you record the Action, the plug-in records the settings you use in its dialog into the Action's step, and when the Action is played, the plug-in uses the recorded settings and bypasses its dialog, so the Action can be batched. The Action you would record has two steps. Step 1: menu File > Automate > Fit Image... and in the Fit Image dialog enter 1020 in the width and height fields. Step 2: Canvas Size; enter 1020 pixels in width and height (not relative), leave the anchor point centered if you want even borders on two sides, and set the canvas color to white. You can then batch the Action.
The above script will also work. It squares the document and then resizes to 1020x1020, whereas the Action resizes the image to fit within a 1020 x 1020 area and then adds any missing canvas. The script, like the Action, only processes one image, so it would also need to be batched: record the script into an Action and batch the Action. As the author wrote, the script's resize-canvas step did not specify an anchor point, so the default center anchor point is used; like the Action, canvas will be added to two sides. -
Large number of JSP performance
Hi,
a colleague of mine ran tests with a large number of JSPs and identified a performance problem.
I believe I found a solution to his problem. I tested it with WLS 5.1 SP2 and SP3 and the MS jview SDK 4.
The issue was related to the duration of the initial call of the nth JSP, which is our situation, as we are doing site hosting.
The solution is able to perform around 14 initial invocations/s no matter whether the invocation is the first one or the 3000th one, and throughput can go up to 108 JSPs/s when the JSPs are already loaded (the JSPs being the snoopservlet example copied 3000 times).
The ratios are of more interest than the values, as the test machine (client and WLS 5.1) was a 266 MHz laptop.
I repeat Marc's post of 2/11/2000, as it is an old one:
Hi all,
I'm wondering if any of you has experienced performance issues when deploying a lot of JSPs.
I'm running WebLogic 4.51 SP4 with the performance pack on NT4 and JDK 1.2.2.
I deployed over 3000 JSPs (identical but with distinct names) on my server.
I took care to precompile them off-line.
To run my tests I used a servlet selecting one of them randomly and redirecting the request:
getServletContext().getRequestDispatcher(randomUrl).forward(request, response);
The response time slows down dramatically as the number of distinct JSPs invoked grows (up to 100 times the initial response time).
I made some additional tests.
When you set the properties:
weblogic.httpd.servlet.reloadCheckSecs=-1
weblogic.httpd.initArgs.*.jsp=..., pageCheckSeconds=-1, ...
Then the response time for a new JSP seems linked to a "capacity increase process" and depends on the number of previously activated JSPs. If you invoke a previously loaded page, the server answers really fast with no delay.
If you set the previous properties to any other value (0 for example), the response time remains bad even when you invoke a previously loaded page.
SOLUTION DESCRIPTION
Intent
The package described below is designed to allow
* Fast invocation even with a large number of pages (which can be the case
with Web Hosting)
* Dynamic update of compiled JSP
Implementation
The current implementation has been tested with JDK 1.1 only and works with
MS SDK 4.0.
It has been tested with WLS 5.1 with service packs 2 and 3.
It should work with most application servers, as its requirements are limited: it only requires that a JSP be able to invoke a class loader.
Principle
For fast invocation, it does not support dynamic compilation as described in the JSP model.
There is no automatic recognition of modifications. Instead, a JSP is made available to invalidate pages which must be updated.
We assume pages managed through this package to be declared in
weblogic.properties as
weblogic.httpd.register.*.ocg=ocgLoaderPkg.ocgServlet
This definition means that, when a servlet or JSP with a .ocg extension is
requested, it is
forwarded to the package.
It implies 2 things:
* Regular JSP handling and package based handling can coexist in the same
Application Server
instance.
* It is possible to extend the implementation to support many extensions
with as many
package instances.
The package (ocgLoaderPkg) contains 2 classes:
* ocgServlet, a servlet instantiating JSP objects using a class loader.
* ocgLoader, the class loader itself.
A single class loader object is created.
Both the JSP instances and classes are cached in hashtables.
The invalidation JSP is named jspUpdate.jsp.
To invalidate a JSP, it simply removes the object and class entries from the caches.
ocgServlet
* Lazily creates the class loader.
* Retrieves the target JSP instance from the cache, if possible.
* Otherwise, it uses the class loader to retrieve the target JSP class, creates a target JSP instance, and stores it in the cache.
* Forwards the request to the target JSP instance.
ocgLoader
* If the requested class does not have the extension ocgServlet is configured to process, it behaves as a regular class loader and forwards the request to the parent or system class loader.
* Otherwise, it retrieves the class from the cache, if possible.
* Otherwise, it loads the class.
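Language aside, the scheme ocgServlet and ocgLoader implement - check the cache first, load and cache on a miss, invalidate explicitly (the role of jspUpdate.jsp) - can be sketched as follows. The loader callable stands in for real class loading, and Python dicts size themselves, so the initial-capacity tuning mentioned below has no direct equivalent here:

```python
# Sketch of the ocgServlet/ocgLoader caching scheme: look up the
# cache first, load and cache on a miss, and expose an explicit
# invalidate (the role played by jspUpdate.jsp).

class PageCache:
    def __init__(self, loader):
        self.loader = loader     # stand-in for the class loader
        self.instances = {}      # cached "JSP instances"
        self.loads = 0           # count real loads, for illustration

    def get(self, name):
        if name not in self.instances:        # miss: load and cache
            self.instances[name] = self.loader(name)
            self.loads += 1
        return self.instances[name]           # hit: served from cache

    def invalidate(self, name):
        # drop the entry so the next request reloads a fresh copy
        self.instances.pop(name, None)

cache = PageCache(lambda name: f"compiled:{name}")
cache.get("page1.ocg"); cache.get("page1.ocg")
assert cache.loads == 1          # second request hit the cache
cache.invalidate("page1.ocg")
cache.get("page1.ocg")
assert cache.loads == 2          # reload after invalidation
```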
Do you think it is a good solution?
I believe that solution is faster than the standard WLS one, not only because it is a very small piece of code, but also because:
- my class loader is deterministic: if the file has the right extension, I don't call the class loader hierarchy first
- I don't try to support jars. This was one of the hardest design decisions. We definitely need a way to update a specific page, but at the same time someone told us NT could have problems handling 3000 files in the same directory (it seems he was wrong).
- I don't try to check whether a class has been updated. I have to ask for a refresh using a JSP now, but it could be an EJB.
- I don't try to check whether a source has been updated.
- As I know the number of JSPs, I can set the initial capacity of the hashtables I use as caches pretty accurately, and so avoid rehashing.
Use a profiler to find the bottlenecks in the system. You need to determine where the performance problems (if you even have any) are happening. We can't do that for you.
-
Animation (Growing Number)
I am trying to make a transition/animation from a smaller to a larger number. How do you do that?
They did that in 2009 Keynote Event at about 6min 24sec
http://www.youtube.com/watch?v=Or1m7Aqgb8c
Appreciate any help!! Thanks.
Hi,
Welcome to Discussions.
From what I understand, you are looking for an animation effect that builds out a number and builds in a new number.
If so, I think they used the scale effect found under the Effects menu in the animation section.
Just choose your first item and build it out with the scale effect. Then select the second item and choose the build-in scale animation.
Your last step will be to set the timing so the second item starts after the first animation.
Hope this helps you,
Ziv -
How to handle a large number of query parameters for a Browse screen
I need to implement an advanced search functionality in a browse screen for a large table. The table has 80+ columns and therefore will have a large number of possible query parameters. The screen will be built on a modeled query with all
of the parameters marked as optional. Given the large number of parameters, I am thinking that it would be better to use a separate screen to receive the parameter input from the user, rather than a Popup. Is it possible for example to have a search
button on the browse screen (screen a) open a new screen (screen b) that contains all of the search parameters, have the user enter the parameters they want, then click a button to send all of the parameters back to screen a where the query is executed and
the search results are returned to the table control? This would effectively make screen b an advanced modal window for screen a. In addition, if the user were to execute the query, then want to change a parameter, they would need to be able to
re-open screen b and have all of their original parameters still set. How would you implement this, or otherwise deal with a large number of optional query parameters in the html client? My initial thinking is to store all of the parameters in
an object and use beforeShown/afterClosed to pass them between the screens, but I'm not quite sure how to make that work. TIA
Wow Josh, thanks. I have a lot of reading to do. What I ultimately plan to do with this (my other posts relate to this too) is have a separate screen for advanced filtering that also allows the user to save their queries if desired.
There is an excellent way to get at all of the query information in the Query_Executed() method. I just put an extra Boolean parameter in the query called "SaveQuery" and when true, the Query_Executed event triggers an entry into a table with
the query name, user name, and parameter value pairs that the user entered. Upon revisiting the screen, I want the user to be able to select from their saved queries and load all the screen parameters (screen properties) from their selected query.
I almost have it working. It may be as easy as marking all of the screen properties that are query parameters as screen parameters (not required), then passing them in from the saved query data (filtered by username, queryname, and selected
item). I'll post an update once I get it. Probably will have some more questions as I go through it. Thanks again! -
How to calculate the area of a large number of polygons in a single query
Hi forum
Is it possible to calculate the area of a large number of polygons in a single query using a combination of SDO_AGGR_UNION and SDO_AREA? So far, I have tried doing something similar to this:
select sdo_geom.sdo_area((
select sdo_aggr_union ( sdoaggrtype(mg.geoloc, 0.005))
from mapv_gravsted_00182 mg
where mg.dblink = 521 or mg.dblink = 94 or mg.dblink = 38 <many many more....>),
0.0005) calc_area from dual
The table MAPV_GRAVSTED_00182 contains 2 fields - geoloc (SDO_GEOMETRY) and dblink (an id field) - needed for querying specific polygons.
As far as I can see, I need to first somehow get a single SDO_GEOMETRY object and use this as input for the SDO_AREA function. But I'm not 100% sure that I'm doing this the right way. This query is very inefficient, and sometimes fails with strange errors like "No more data to read from socket" when executed from SQL Developer. I even tried the latest JDBC driver from Oracle, without much difference.
Would a better approach be to write some kind of stored procedure that adds up the areas of the single geometries by calling SDO_AREA on each geometry object - or what is the best approach?
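For comparison, if the polygons don't overlap, the union step can be skipped entirely and the per-polygon areas simply summed. A plain-Python sketch of that idea using the shoelace formula, with made-up planar coordinates (SDO_AREA itself handles geodetic data and tolerances, which this ignores):

```python
# Sketch: instead of unioning all geometries and measuring once,
# sum the per-polygon areas. Shoelace formula on made-up planar
# coordinates; non-overlapping polygons assumed, since overlaps
# are exactly what the union approach accounts for.

def polygon_area(ring):
    """Shoelace formula for a simple polygon given as (x, y) vertices."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

polygons = [
    [(0, 0), (2, 0), (2, 2), (0, 2)],   # 2x2 square, area 4
    [(5, 0), (8, 0), (5, 4)],           # right triangle, area 6
]
total = sum(polygon_area(p) for p in polygons)
assert total == 10.0
```

The design point carries over to SQL: summing SDO_AREA per row lets the database stream geometries one at a time, whereas SDO_AGGR_UNION must hold and merge one ever-growing geometry.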
Any advice would be appreciated.
Thanks in advance,
Jacob
Hi
I am now trying to update all my spatial tables with SRIDs. To do this, I try to drop the spatial index first and recreate it after the update. But for a lot of tables I can't drop the spatial index. Whenever I try DROP INDEX <spatial index name>, I get this error - does anyone know what it means?
Thanks,
Jacob
Error starting at line 2 in command:
drop index BSSYS.STIER_00182_SX
Error report:
SQL Error: ORA-29856: error occurred in the execution of ODCIINDEXDROP routine
ORA-13249: Error in Spatial index: cannot drop sequence BSSYS.MDRS_1424B$
ORA-13249: Stmt-Execute Failure: DROP SEQUENCE BSSYS.MDRS_1424B$
ORA-29400: data cartridge error
ORA-02289: sequence does not exist
ORA-06512: at "MDSYS.SDO_INDEX_METHOD_10I", line 27
29856. 00000 - "error occurred in the execution of ODCIINDEXDROP routine"
*Cause: Failed to successfully execute the ODCIIndexDrop routine.
*Action: Check to see if the routine has been coded correctly.
Edit - just found the answer to this in MetaLink note 241003.1. Apparently there is some internal problem when dropping spatial indexes; some objects get dropped that shouldn't be. The solution is to manually create the sequence it complains it can't drop, and then it works... Weird error.
Best practice for handling data for a large number of indicators
I'm looking for suggestions or recommendations for how to best handle a UI with a "large" number of indicators. By large I mean enough to make the block diagram quite large and ugly after the data processing for each indicator is added. The data must be "unpacked" and then decoded, e.g., booleans, offset-binary bit fields, etc. The indicators are updated once per second. I am leaning towards a method that worked well for me previously: binding network-shared variables to each indicator, then using several sub-VIs to process each particular piece of data and write to the appropriate variables.
I was curious what others have done in similar circumstances.
Bill
“A child of five could understand this. Send someone to fetch a child of five.”
― Groucho Marx
Solved!
Go to Solution.
I can certainly feel your pain.
Note that's really what is going on in that PNG - you can see the Action Engine responsible for updating the display to the far right.
In my own defence: the FP concept was presented to the client's customer before they had a person familiar with LabVIEW identified, so I worked it this way through no choice of mine. I knew it would get ugly before I walked in the door and chose to meet the challenge head-on anyway. Defer Panel Updates was my very good friend. The sensors these objects represent were constrained to pass info via a single ZigBee network, so I had the benefit of fairly low data rates, but even changing the view (yes, there is a display mode that swaps what information is displayed for each sensor) was fast enough that the user still got a responsive GUI.
(The GUI did scale poorly though! That is a lot of wires! I was grateful to Jack for the idea to make align and distribute work on wires.)
Jeff -
How to add a large number of keywords to the e-mail filter?
Hello.
I would like to know how to add a large number of keywords to a filter.
The thing I want to accomplish, is it to add around 4000 e-mail addresses to a filter list, which checks the body of incoming e-mails, which get forwarded to me.
I don't want to outright delete them, but I would love it if it detects that the forwarded message contains one of the e-mail addresses, it would add a tag to the message.
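Thunderbird's filter engine aside, the matching itself - tag a message whose body contains any of ~4000 addresses - is cheap if the addresses live in a set and the body is scanned once; a sketch with made-up addresses and tag name:

```python
# Sketch: checking a body against thousands of addresses. Rather
# than running 4000 separate substring searches, extract
# address-like tokens from the body once and intersect with a set;
# the lookup cost stays near-constant as the list grows.
# The addresses and tag name are made up for illustration.

import re

watched = {f"user{n}@example.com" for n in range(4000)}

def tag_if_match(body, tag="Forwarded-Watch"):
    tokens = set(re.findall(r"[\w.+-]+@[\w.-]+", body))
    return tag if tokens & watched else None

assert tag_if_match("Forwarded from user42@example.com yesterday") == "Forwarded-Watch"
assert tag_if_match("nothing relevant here") is None
```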
Is it in any way possible to make a filter like this which doesn't slow Thunderbird down to a crawl?
I tried to copy the whole list into the small filter tab, but it had no discernible effect on my messages, since some of the previously received ones, which I was sure contained the keywords, were not tagged. All it did was make the program super slow, and I was forced to delete the filter.
You can look at creating an exclusion crawl rule:
http://technet.microsoft.com/en-us/library/jj219686(v=office.15).aspx
You can also modify your content source starting addresses and remove onedrive:
http://technet.microsoft.com/en-us/library/jj219808(v=office.15).aspx
Blog | SharePoint Field Notes Dev Tools |
SPFastDeploy | SPRemoteAPIExplorer -
Hello!
I have been using Aperture for years, and have just one small problem. There have been many times where I want to have multiple versions of a large number of images. I like to do a color album and B&W album for example.
Previously, I would click on all the images at once and select New Version. The problem is this puts all of the new versions in a stack. I then have to open all the stacks and, one by one, move the new versions to a different album. Is there any way to streamline this process? When it's only 10 images, it's no problem. When it's a few hundred (or more) it's rather time consuming...
What I'm hoping for is a way to either automatically have new versions populate a separate album, or for a way to easily select all the new versions I create at one time, and simply move them with as few steps as possible to a new destination.
Thanks for any help,
Ricardo
Ricardo,
in addition to Kirby's and phosgraphis's excellent suggestions, you may want to use the filters to further restrict your versions to the ones you want to access.
For example, you mentioned
I like to do a color album and B&W album for example.
You could easily separate the color versions from the black-and-white versions by using the filter rule:
Adjustment includes Black&white
or
Adjustment does not include Black&white
With the above filter setting (Add rule > Adjustment includes Black&White), only the versions with the Black&White adjustment are shown in the Browser. You could do something similar to separate cropped versions from uncropped ones.
Regards
Léonie -
Create a large number of purchase order in ME21N
Hello,
Is there a CATT or LSMW transaction, or a program, to create a large number of purchase orders with their line items?
Thanks in advance
Fany
You can use LSMW with the direct input method
Object - 0085, method - 0001
venkat -
Approach to parse large number of XML files into the relational table.
We are exploring the option of XML DB for processing a large number of files coming in on the same day.
The objective is to parse the XML file and store in multiple relational tables. Once in relational table we do not care about the XML file.
The file cannot be stored on the file server and needs to be stored in a table before parsing, due to security issues. A third-party system will send the file and will store it in the XML DB.
File size can be between 1MB and 50MB, and high performance is very much expected, otherwise the solution will be tossed.
Although we do not have an XSD, the XML file is well structured. We are on 11g Release 2.
Based on my reading, this is my approach:
1. CREATE TABLE XML_DATA
(xml_col XMLTYPE)
XMLTYPE xml_col STORE AS SECUREFILE BINARY XML;
2. Third party will store the data in XML_DATA table.
3. Create XMLINDEX on the unique XML element
4. Create views on XMLTYPE
CREATE OR REPLACE FORCE VIEW V_XML_DATA (
Stype,
Mtype,
MNAME,
OIDT)
AS
SELECT x."Stype",
x."Mtype",
x."Mname",
x."OIDT"
FROM data_table t,
XMLTABLE (
'/SectionMain'
PASSING t.data
COLUMNS Stype VARCHAR2 (30) PATH 'Stype',
Mtype VARCHAR2 (3) PATH 'Mtype',
MNAME VARCHAR2 (30) PATH 'MNAME',
OIDT VARCHAR2 (30) PATH 'OID') x;
5. Bulk load the parsed data into the staging table based on the index column.
Please comment on the above approach and suggest anything that can improve the performance.
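Outside the database, the same shredding the XMLTABLE view performs - one pass over SectionMain producing a flat relational row - can be sketched with Python's standard library; the element names follow the sample XML in this thread, everything else is illustrative:

```python
# Sketch: shred a SectionMain document into a flat row, mirroring
# the XMLTABLE projection (SectionType, MachineType, MachineName,
# OID). Element-to-column mapping follows the sample XML in this
# thread; the wrapper function and row dict are illustrative.

import xml.etree.ElementTree as ET

doc = """<SectionMain>
  <SectionType>Reel</SectionType>
  <MachineType>CP</MachineType>
  <MachineName>CP_225</MachineName>
  <OID>99dd48cf-fd1b-46cf-9983-0026c04963d2</OID>
</SectionMain>"""

def shred(xml_text):
    root = ET.fromstring(xml_text)
    return {
        "SectionType": root.findtext("SectionType"),
        "MachineType": root.findtext("MachineType"),
        "MachineName": root.findtext("MachineName"),
        "OID": root.findtext("OID"),
    }

row = shred(doc)
assert row["MachineName"] == "CP_225"
assert row["OID"].startswith("99dd48cf")
```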
Thanks
AnuragT
Thanks for your response. It gives me more confidence.
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
TNS for Linux: Version 11.2.0.3.0 - Production
Example XML
<SectionMain>
<SectionState>Closed</SectionState>
<FunctionalState>CP FINISHED</FunctionalState>
<CreatedTime>2012-08</CreatedTime>
<Number>106</Number>
<SectionType>Reel</SectionType>
<MachineType>CP</MachineType>
<MachineName>CP_225</MachineName>
<OID>99dd48cf-fd1b-46cf-9983-0026c04963d2</OID>
</SectionMain>
<SectionEvent>
<SectionOID>99dd48cf-2</SectionOID>
<EventName>CP.CP_225.Shredder</EventName>
<OID>b3dd48cf-532d-4126-92d2</OID>
</SectionEvent>
<SectionAddData>
<SectionOID>99dd48cf2</SectionOID>
<AttributeName>ReelVersion</AttributeName>
<AttributeValue>4</AttributeValue>
<OID>b3dd48cf</OID>
</SectionAddData>
<SectionAddData>
<SectionOID>99dd48cf-fd1b-46cf-9983</SectionOID>
<AttributeName>ReelNr</AttributeName>
<AttributeValue>38</AttributeValue>
<OID>b3dd48cf</OID>
</SectionAddData>
<BNCounter>
<SectionID>99dd48cf-fd1b-46cf-9983-0026c04963d2</SectionID>
<Run>CPFirstRun</Run>
<SortingClass>84</SortingClass>
<OutputStacker>D2</OutputStacker>
<BNCounter>54605</BNCounter>
</BNCounter>
I was not aware of virtual columns, but it looks like we can use them and avoid creating views by inserting directly into the staging table using the virtual column.
Suppose OID is the unique identifier of each XML file, and I created a virtual column:
CREATE TABLE po_Virtual OF XMLTYPE
XMLTYPE STORE AS BINARY XML
VIRTUAL COLUMNS
(OID_1 AS (XMLCAST(XMLQUERY('/SectionMain/OID'
PASSING OBJECT_VALUE RETURNING CONTENT)
AS VARCHAR2(30))));
1. My question is: how will I then write this query by NOT USING COLUMN XML_COL?
SELECT x."SECTIONTYPE",
x."MACHINETYPE",
x."MACHINENAME",
x."OIDT"
FROM po_Virtual t,
XMLTABLE (
'/SectionMain'
PASSING t.xml_col <--WHAT WILL PASSING HERE SINCE NO XML_COL
COLUMNS SectionType VARCHAR2 (30) PATH 'SectionType',
MachineType VARCHAR2 (3) PATH 'MachineType',
MachineName VARCHAR2 (30) PATH 'MachineName',
OIDT VARCHAR2 (30) PATH 'OID') x;
2. Instead of creating the view, can I then do
insert into STAGING_table_yyy ( col1 ,col2,col3,col4,
SELECT x."SECTIONTYPE",
x."MACHINETYPE",
x."MACHINENAME",
x."OIDT"
FROM xml_data t,
XMLTABLE (
'/SectionMain'
PASSING t.xml_col <--WHAT WILL PASSING HERE SINCE NO XML_COL
COLUMNS SectionType VARCHAR2 (30) PATH 'SectionType',
MachineType VARCHAR2 (3) PATH 'MachineType',
MachineName VARCHAR2 (30) PATH 'MachineName',
OIDT VARCHAR2 (30) PATH 'OID') x
where oid_1 = '99dd48cf-fd1b-46cf-9983';<--VIRTUAL COLUMN
insert into STAGING_table_yyy ( col1 ,col2,col3
SELECT x."SectionOID",
x."EventName",
x."OIDT"
FROM xml_data t,
XMLTABLE (
'/SectionMain'
PASSING t.xml_col <--WHAT WILL PASSING HERE SINCE NO XML_COL
COLUMNS SectionOID PATH 'SectionOID',
EventName VARCHAR2 (30) PATH 'EventName',
OID VARCHAR2 (30) PATH 'OID',
) x
where oid_1 = '99dd48cf-fd1b-46cf-9983';<--VIRTUAL COLUMN
Same insert for other tables using the OID_1 virtual column
3. Finally, once done, how can I delete the XML document from the XML table?
If I am using the virtual column, then I believe it will be easy:
DELETE FROM po_Virtual WHERE oid_1 = '99dd48cf-fd1b-46cf-9983';
But in case we cannot use the virtual column, how can we delete the data?
Thanks in advance
AnuragT