Suggestions for Very Large Hierarchy

Hello,
We have general ledger information with accounts and transactions that we are loading into BI, and we were told to use SAP's hierarchy structure. However, that seems to be limited to 50,000-100,000 nodes, and the file we are using has 7.5 million records. Does anyone have any suggestions as to how to set up a hierarchy of this size?
Thanks!

Similar Messages

  • How can we suggest a new DBA OCE certification for very large databases?

    What web site can we visit, or what phone number can we call, to suggest creating a VLDB OCE certification?
    The largest databases that I have ever worked with were barely over 1 trillion bytes.
    Some people have told me that the work of a DBA changes completely when you have a VERY LARGE DATABASE.
    I would guess that configuration topics such as the following might be on it:
    * Partitioning
    * parallel
    * bigger block size - DSS vs OLTP
    * etc
    Where could I send in a recommendation?
    Thanks, Roger

    I wish there were some details about the OCE data warehousing.
    Look at the topics for 1Z0-515. Assume that the 'lightweight' topics will go (like Best Practices) and that there will be more technical topics added.
    Oracle Database 11g Data Warehousing Essentials | Oracle Certification Exam
    Overview of Data Warehousing
      Describe the benefits of a data warehouse
      Describe the technical characteristics of a data warehouse
      Describe the Oracle Database structures used primarily by a data warehouse
      Explain the use of materialized views
      Implement Database Resource Manager to control resource usage
      Identify and explain the benefits provided by standard Oracle Database 11g enhancements for a data warehouse
    Parallelism
      Explain how the Oracle optimizer determines the degree of parallelism
      Configure parallelism
      Explain how parallelism and partitioning work together
    Partitioning
      Describe types of partitioning
      Describe the benefits of partitioning
      Implement partition-wise joins
    Result Cache
      Describe how the SQL Result Cache operates
      Identify the scenarios which benefit the most from Result Set Caching
    OLAP
      Explain how Oracle OLAP delivers high performance
      Describe how applications can access data stored in Oracle OLAP cubes
    Advanced Compression
      Explain the benefits provided by Advanced Compression
      Explain how Advanced Compression operates
      Describe how Advanced Compression interacts with other Oracle options and utilities
    Data integration
      Explain Oracle's overall approach to data integration
      Describe the benefits provided by ODI
      Differentiate the components of ODI
      Create integration data flows with ODI
      Ensure data quality with OWB
      Explain the concept and use of real-time data integration
      Describe the architecture of Oracle's data integration solutions
    Data mining and analysis
      Describe the components of Oracle's Data Mining option
      Describe the analytical functions provided by Oracle Data Mining
      Identify use cases that can benefit from Oracle Data Mining
      Identify which Oracle products use Oracle Data Mining
    Sizing
      Properly size all resources to be used in a data warehouse configuration
    Exadata
      Describe the architecture of the Sun Oracle Database Machine
      Describe configuration options for an Exadata Storage Server
      Explain the advantages provided by the Exadata Storage Server
    Best practices for performance
      Employ best practices to load incremental data into a data warehouse
      Employ best practices for using Oracle features to implement high performance data warehouses

  • Grid Control Architecture for Very Large Sites: New Article published

    A new article on Grid Control was published recently:
    Grid Control Architecture for Very Large Sites
    http://www.oracle.com/technology/pub/articles/havewala-gridcontrol.html

    Oliver,
    Thanks for the comments. The article is based on practical experience. If one were to recommend a pool of 2 management servers for a large corporation with 1000 servers, that would mean that if 1 server was brought down for any maintenance reason (e.g., applying an EM patch), the entire EM workload would fall on the remaining management server. So it is better to have 3 management servers instead of 2 when the EM system is servicing so many targets. Otherwise, the DBAs would be a tad angry, since the single remaining management server would not be able to service them properly during the maintenance fix on the first.
    The article ends with these words: "You can easily manage hundreds or even *thousands* of targets with such an architecture. The large corporate which had deployed this project scaled easily up to managing 600 to 700 targets with a pool of just three management servers, and the future plan is to manage *2,000 or more* targets which is quite achievable." The 2,000 or more is based on the same architecture of 3 management servers.
    So as per the best practice document, 2 management servers would be fine for 1000 servers, although I would still advise 3 servers in practice.
    For your case of 200 servers, it depends on the level of monitoring you are planning to do and the type of database management activities that will be done by the DBAs. For example, if the DBAs are planning on creating standby databases now and then through Grid Control, running backups daily via Grid Control, cloning databases in Grid Control, patching databases in Grid Control, and so on, I would definitely advise a pool of 2 servers in your case. 2 is always better than 1.
    Regards,
    Porus.
    Edited by: Porushh on Feb 21, 2009 12:51 AM

  • Using SRM for very large contracts and contract management

    We are doing an SRM 7.01 implementation project. SRM will be used primarily for outsourced contract management. The contracts are all services associated with facilities (plant) maintenance, plus support services like cleaning or catering.
    They have very large numbers of items priced individually (e.g. 10,000) per contract. An item's price depends on the location where the work is expected to be performed. The location is represented by an SAP RE-FX architectural object. A service can be priced at any level of the hierarchy, e.g. service A is priced the same across the whole state but service B is priced per campus.
    q1. SAP advises that there are performance limitations on SRM contracts of more than 2,000 lines. Has anyone experience of a solution that provides very large contracts in SRM? How did you do it, please?
    q2. SAP advises using the plant to represent the location for pricing purposes, but this would result in a very large number of plants. Has anyone experience of alternative solutions for variable location pricing in SRM contracts, i.e. integrating the RE-FX architectural object or similar into contract and PO line items?
    thanks very much

    Hi Prakash,
    SRM does provide contract management functionality in the form of Purchase Contracts and Global Outline Agreements, but it is used as part of sourcing for materials and services. The materials or services have contracts against some given target value, against which POs are released. The contract is based on a material number (either a material or a service) which will be used as a source of supply during the creation of the Shopping Cart. It might not really fit the scenario of carriers and freight forwarders, but it can still be customized for this kind of use.
    The contract management functionalities in the R/3 space can also be looked at for this purpose.
    Reg
    Sachin

  • Setting resolution, deciding file type, for very LARGE Canvas prints. 36MP camera.

    Okay, so I noticed my Lightroom was set to 240 PPI resolution. I changed it to 300 because I read that 300 was the standard for prints. What would I need to set in the export resolution box for a very large canvas print?
    Is it better to choose TIFF instead of JPEG for prints of this quality? If not, then what should I choose?
    I am using a Sony A7R full-frame 36.4MP camera, and with some of the sharp Zeiss lenses there is really no noticeable pixelation when I zoom to 100 percent. Of course, the A7R is said to have one of the best sensors on the market today. It's supposed to be like the Nikon D800E, but apparently it has some advantages.
    In other words, I want to export in as high a quality as possible for the canvas. File size is not an issue.

    Changing the resolution setting does absolutely nothing to the digital image. This is a common misconception. The only thing that counts in the digital image is the pixel dimensions. Regardless of what the PPI setting is (240, 300, 600, whatever) the image still has the same number of pixels. To determine what you need for a print of any size it is necessary to multiply the inches by the desired pixels per inch. Suppose you want a 16 x 20" print at 300 pixels per inch. The math would be something like this:
    300x16 = 4800 pixels
    300x20 = 6000 pixels
    So to print a 16 x 20" print you would need an image that is 4800 x 6000 pixels. And the PPI setting can be anything you want it to be because it has no effect on the image.
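    To make the arithmetic concrete, here is a minimal sketch of the same calculation (the 16 x 20" print at 300 PPI from above; the class and method names are just for illustration):

        public class PrintSize {
            // Pixels needed along one edge for a print of the given
            // length in inches at the given pixels-per-inch target.
            static long pixelsNeeded(double inches, int ppi) {
                return Math.round(inches * ppi);
            }

            public static void main(String[] args) {
                int ppi = 300; // desired print resolution
                System.out.println(pixelsNeeded(16, ppi)); // 4800
                System.out.println(pixelsNeeded(20, ppi)); // 6000
            }
        }

    The same function answers the reverse question too: at 36.4MP (roughly 7360 x 4912 pixels), dividing the pixel dimensions by the target PPI gives the largest print size that still meets that target.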

  • Now I understand the reason for very large screen monitors

    I now understand why many people want to buy the new very large screen Macs. I have always loved my 17" flatscreen iMac G4 and since the picture itself is the same and the only thing that changes is the real estate around the picture, I always felt that getting a larger monitor would be an exercise in self-indulgence.
    Well, now I see that all of the surrounding real estate would be very useful for putting folders, documents, pictures, etc. on the screen and having them visible and accessible. For someone making a webpage, and I imagine also for making iMovies, which I will soon do for my website, a 19" or larger screen would be VERY helpful. But when a person buys a new computer, all of that information has to be transferred... and that is enough to make a person stick with the 17" monitor (that and the price of a 21").
    — Lorna in Southern California

    Have you tried using Exposé to make that smaller screen more expansive? You can drag from one window to another with an Exposé transition in between. While not a replacement for a larger screen, it does help when I need it.
    Ken, I've had Tiger for about a week and the only things I've been working with are iWeb and iPhoto! Later I will explore Exposé. It sounds like they were trying to help us out and that's good.
    — Lorna in Southern California

  • Creating a custom DataBuffer for very LARGE images

    I would like to work with very large images (up to 100MB) in an application.
    Only for image manipulation, not rendering to the screen.
    My idea is to write my own DataBuffer which uses the hard drive, maybe with some "intelligent" swapping.
    But at first, performance doesn't matter.
    I tried this:

        ColorModel cm = ColorModel.getRGBdefault();
        SampleModel sm = cm.createCompatibleSampleModel(w, h);
        DataBuffer db = new myDataBufferInt(size);
        WritableRaster raster = Raster.createWritableRaster(sm, db, null);

    But somebody doesn't like myDataBufferInt, even though it is of type DataBuffer.TYPE_INT. It throws:

        java.awt.image.RasterFormatException: IntegerComponentRasters must have integer DataBuffers
            at sun.awt.image.IntegerComponentRaster.<init>(IntegerComponentRaster.java:155)
            at sun.awt.image.IntegerInterleavedRaster.<init>(IntegerInterleavedRaster.java:111)
            at sun.awt.image.IntegerInterleavedRaster.<init>(IntegerInterleavedRaster.java:78)
            at java.awt.image.Raster.createWritableRaster(Raster.java:980)

    My class looks like this:

        public class myDataBufferInt extends DataBuffer {
            public myDataBufferInt(int size) {
                super(TYPE_INT, size);
                . . .
            }
        }

    The question is: how do I manage a large image? What kind of subclasses do I have to write?
    Thank you all.
    P.S.: I don't want to use java -Xmsn -Xmxn.

    I have the same problem. Please let me know if you find an answer.
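    For anyone hitting the same wall: the exception above suggests that the built-in integer rasters require a concrete java.awt.image.DataBufferInt rather than an arbitrary DataBuffer subclass, so swapping in a disk-backed buffer at that level looks like a dead end. An alternative that avoids holding the whole image in memory is to read and process it in tiles using ImageIO's source-region support. A minimal sketch, assuming a file on disk (the file name, tile size, and class name are made up for illustration):

        import java.awt.Rectangle;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.ImageIO;
        import javax.imageio.ImageReadParam;
        import javax.imageio.ImageReader;
        import javax.imageio.stream.ImageInputStream;

        public class TiledRead {
            public static void main(String[] args) throws Exception {
                ImageInputStream in = ImageIO.createImageInputStream(new File("big.tif"));
                ImageReader reader = ImageIO.getImageReaders(in).next(); // assumes a reader exists for the format
                reader.setInput(in);
                int w = reader.getWidth(0);
                int h = reader.getHeight(0);
                int tile = 1024; // tile edge in pixels; tune to the available heap
                for (int y = 0; y < h; y += tile) {
                    for (int x = 0; x < w; x += tile) {
                        ImageReadParam p = reader.getDefaultReadParam();
                        p.setSourceRegion(new Rectangle(x, y,
                                Math.min(tile, w - x), Math.min(tile, h - y)));
                        BufferedImage chunk = reader.read(0, p); // returns only the requested region
                        // ... manipulate the chunk and write it out, then let it be garbage-collected ...
                    }
                }
                reader.dispose();
                in.close();
            }
        }

    How much each read actually decodes depends on the reader implementation, but for formats like TIFF this keeps the heap bounded by the tile size rather than the full image.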

  • Import hangs for (very) large file

    Hi All,
    We're importing a file that is converted to a single data column in the BefFileImport script.
    So I have a flat file with 250,000 rows and 4 data columns, and I convert it to a file with 1,000,000 rows and 1 data column.
    In the inbox I can see this process takes about 12 minutes, which is long but not a problem for us.
    After that, the FDM web client (11.1.1.3) keeps showing "processing, please wait..." (even after a day).
    We saw that there was no activity on the server, so we closed the web client, logged on again, and found the import had actually been successful.
    So, apparently because the import step takes so long, it doesn't show the success message but keeps hanging.
    Any suggestions how to solve this?
    thanks in advance,
    Marc

    The only thing I am aware of that would cause this is a timeout Registry key inserted on the workstation, as noted in the following Microsoft KB. If you have this in place, remove it, reboot, and then test the import again.
    http://support.microsoft.com/kb/181050

  • Best Practice for very large itunes and photo library..using Os X Server

    Ok setup....
    One iMac, one new MacBook Pro, one MacBook, all on Leopard. Wired and wireless, all AirPort Extremes and Expresses.
    I have purchased a Mac mini plus a FireWire 800 2TB RAID drive.
    I have a 190GB ever-increasing music library (I rip one-to-one, no compression) and a 300GB photo library.
    So, question: will it be easier to set up OS X Server on the mini and access my iTunes library via that?
    Is it easy to do so?
    I only rip via the iMac, so the library is connected to that and shared to the laptops... how does one go about making the iMac automatically connect to the music if I transfer all music to the server?
    The photo bit can wait depending on the answer to the music..
    many thanks
    Adrian

    I have a much larger iTunes collection (500GB / 300k songs, a lot more photos, and several terabytes of movies). I share them out via a Linux server. We use Apple TV for music/video, and the bottleneck appears to be the Mac running iTunes in the middle. I have all of the laptops (MacBook Pros) set up with their own "instance" of iTunes that just references the files on the server. You can enable sharing in iTunes itself, but with a library this size, performance on things like loading cover art and browsing the library is not great. Please note also that I haven't tried 8.x, so there may be some performance enhancements that have improved things.
    There is a lag of a second or so when accessing music/video on the server. I suspect this is due to the speed of the Mac accessing the network shares, but it's not bad, and you never notice it once the music or video starts. Some of this on the video front may be the codec settings I used to encode the video.
    I suspect that as long as you are doing just music, this isn't going to be an issue for you with a mini. I also suspect that you don't need OS X Server at all. You can just do a file share in OS X and give each machine a local iTunes instance pointing back at the files on the server and have a good setup.

  • Suggestions for Chunking Large Outbound Web Service Messages from BPEL

    We have a problem today when Oracle EBS sends a large amount of data to BPEL; BPEL processes that data and then attempts to pass it as a web service, via a partner link, to the PeopleSoft Integration Broker. When the message is too large (> 10,000 records or so), we hit issues on the PeopleSoft web server side, which is unable to receive that large a message.
    "allocLargeObjectOrArray - Object size: 32768016, Num elements: 8192000"
    We are considering adjusting the BPEL program to chunk the outbound data being sent to PeopleSoft. The idea would be to use some type of loop and just break after so many lines, transmit the current message and then start a new message.
    The PeopleSoft integration broker has a setting, for outbound type integrations, that can enable an automated chunking feature called ‘Max App Message Size’. I was wondering, does BPEL have a similar configuration setting that might provide this feature? I’m dreaming up that this could be a property of the partner link object that could allow you to specify a certain ‘max size’ and then it would just know to start a new message at that threshold. That might avoid us having to make any additional BPEL code changes at this point.
    Currently using version 11.1.1.1.
    Any advice, much appreciated.
    Regards,
    Ken
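    For what it's worth, I am not aware of a partner-link property that chunks automatically, so the loop idea above seems the safer bet. The batching itself is simple; here is a minimal Java sketch of the pattern (the record type, batch size, and class name are placeholders, not anything from the BPEL API):

        import java.util.ArrayList;
        import java.util.List;

        public class Chunker {
            // Split a record list into fixed-size batches so that each
            // outbound message stays under the receiver's size limit.
            static <T> List<List<T>> chunk(List<T> records, int batchSize) {
                List<List<T>> batches = new ArrayList<List<T>>();
                for (int i = 0; i < records.size(); i += batchSize) {
                    batches.add(records.subList(i, Math.min(i + batchSize, records.size())));
                }
                return batches;
            }

            public static void main(String[] args) {
                List<Integer> records = new ArrayList<Integer>();
                for (int i = 0; i < 25000; i++) records.add(i);
                // Transmit each batch as its own message instead of one 25,000-record payload.
                for (List<Integer> batch : chunk(records, 10000)) {
                    System.out.println("sending batch of " + batch.size() + " records");
                }
            }
        }

    In BPEL the equivalent is a while loop with an index variable that assigns one slice of the source data to the outbound variable and invokes the partner link once per slice.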

    I have found the following tutorial that implements something similar to what I am trying to do: http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/odi/10g/10135/odiscenario_bpelcallback/odiscenario_bpelcallback.htm#t3
    Because of this, I am now confident that the conversation IDs can be used to achieve correlation. I have implemented the pattern by performing
    an invoke activity on the web service, passing the result of ora:getConversationId() as part of the message. The conversation Id returned is of UUID-Form.
    After the invoke activity, I have added a pick activity to receive the response message supplied by the web service through IDeliveryService.post(...). I can see that the message is received correctly by looking at the contents of DLV_MESSAGE. However, the pick activity times out every time (after 10m). Looking at DLV_SUBSCRIPTION reveals that the conversation_id for the pick/receive activity is set to a value of the form bpel://localhost/default/MyBpelProcessName~1.0/7610001-BpInv0-BpSeq2.7-2. As far as I know, this should instead be set to the UUID that ora:getConversationId() returned before performing the invoke activity. What is going wrong here?
    Thanks for your help!

  • Please help with TV and home theater system for very large basement room....

    I'm looking to get my boyfriend either a TV or a home theater system for the basement. I know he eventually wants a 3D TV. Right now he has a setup with 2 TVs so he can play video games and watch sports.
    The basement that the tv and home theater system will be in is large (I'd estimate at least 15 ft x 30 ft...maybe more) so I want to make sure the home theater system will be enough for the room.  We live in a ranch so basically the basement is almost the size of the house with some of it being storage.
    I've started to look at products online but there are so many and technology isn't really my thing.
    If anyone could help with recommendations for the home theater system and/or the tv I'd appreciate it!
    Thanks so so sooo much!!!!

    What will he be using the HT system for? Just watching sports, or skipping the movie theater and watching movies at home? If you plan to watch them at home, you're going to be on a tight budget with the home theater system, but it is possible. Spend good money on the sub; that is one speaker you don't want to skimp on. With such a big space you have a lot of room to fill, and cheap subs aren't going to cut it. You will probably need two subs as it is. The speakers I recommend on a tight budget: Polk or Klipsch. My list includes a Denon receiver, as I am a Denon fan, but Yamaha, Onkyo, or Pioneer will suffice.
    Option 1 will run you about $1,700 with just one sub. You could always go up a model on each speaker if you feel one sub is enough. I like an explosion to sound like it happened in front of me, so that is why I mention 2 subs.
    http://www.bestbuy.com/site/Denon+-+630W+7.1-Ch.+3D+Pass+Through+A/V+Home+Theater+Receiver/9894577.p...
    http://www.bestbuy.com/site/Polk+Audio+-+Dual+5-1/4%22+2-Way+Floor+Speakers+(Each)+-+Black/8825453.p... speakers&cp=1&lp=12
    http://www.bestbuy.com/site/Polk+Audio+-+5-1/4%22+Bookshelf+Speakers+(Pair)+-+Black/8825444.p?id=120... speakers&cp=1&lp=2
    http://www.bestbuy.com/site/Polk+Audio+-+5-1/4%22+Center-Channel+Speaker+-+Black/8826149.p?id=120735... speakers&cp=1&lp=28
    http://www.bestbuy.com/site/Speakers/Subwoofer-Speakers/abcat0205008.c?id=abcat0205008&&initialize=f...
    Option 2 will cost about $1,400 before a subwoofer.
    http://www.bestbuy.com/site/Denon+-+630W+7.1-Ch.+3D+Pass+Through+A/V+Home+Theater+Receiver/9894577.p...
    http://www.bestbuy.com/site/Mirage+-+OS%26%23179%3B-CC+Ompipolar+3-Way+Center-Channel+Speaker+-+Blac... speaker&cp=1&lp=11
    and 4 of the speaker below:
    http://www.bestbuy.com/site/Mirage+-+OS%26%23179%3B-SAT+4-1/2%22+2-Way+Satellite+Speaker+(Each)+-+Hi... speaker&cp=1&lp=15

  • Using swf file for very large static image

    I have a large photo that I need to use as a banner image in a website. I have tried optimizing it for the web in Fireworks -- to reduce the file size, obviously -- but the client isn't happy with the resolution. I haven't had this issue come up before, but I see his point (to a degree), as the detail and sharpness of the original are far superior.
    I have almost no familiarity with creating in Flash, but I remember reading somewhere that SWF files do not have the same sizes as JPEG images.
    Would it be feasible to include the single static image as an SWF file to preserve its original resolution? I mean, would that image as an SWF file be smaller than the image as a JPEG?
    Many thanks.

    There are two things at play here... image quality (often harmed by compression) and image resolution (# of pixels width by height). What is the real problem?
    Whether you render your image as a GIF, JPG, PNG or SWF file, they will all have the same resolution.
    Packing an image in SWF should be a last resort as not everyone has Flash installed. For example, iPhone users cannot see SWF.
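    If the client's complaint is really about compression artifacts rather than pixel dimensions, re-exporting the JPEG at a higher quality setting may be all that is needed. A minimal sketch using javax.imageio (the file names are placeholders; assumes an RGB source without an alpha channel):

        import java.awt.image.BufferedImage;
        import java.io.File;
        import javax.imageio.IIOImage;
        import javax.imageio.ImageIO;
        import javax.imageio.ImageWriteParam;
        import javax.imageio.ImageWriter;
        import javax.imageio.stream.ImageOutputStream;

        public class JpegQuality {
            public static void main(String[] args) throws Exception {
                BufferedImage img = ImageIO.read(new File("banner-original.png"));
                ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
                ImageWriteParam param = writer.getDefaultWriteParam();
                param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                param.setCompressionQuality(0.95f); // 0.0f = smallest file, 1.0f = least compression
                ImageOutputStream out = ImageIO.createImageOutputStream(new File("banner.jpg"));
                writer.setOutput(out);
                writer.write(null, new IIOImage(img, null, null), param);
                out.close();
                writer.dispose();
            }
        }

    The trade-off is only file size versus artifacts; the pixel dimensions, and therefore the sharpness at a given display size, stay exactly the same.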

  • Suggestions for VERY sluggish timeline?

    Does anyone have any ideas for how to remedy a very sluggish timeline? I have searched this forum and I've seen problems with FCP 5 and Tiger that sound similar to mine, but I'm running FCP 4.5 and OS 10.3.9.
    My timeline has slowed to the point where I can barely use the drag and drop functions anymore. When I do, and two clips butt up next to each other, I get the pinwheel for a few seconds before I can drop the clip, and when I do, it generally shifts past where I wanted to drop it - even in snapping mode!
    You can forget about using slip and slide - these have been rendered useless. Right now, I am forced to use the blade tool, cut the selected clip out of the timeline and paste it in where I want it. What a pain!
    My timelines are not that long - only 20 minutes. the only suspicion I have is that all my media is running off of firewire drives, rather than internals. But it's the internal drive that is spinning when the timeline hangs up. The other thing I've noticed is that this doesn't happen nearly as badly when I am working on a 30fps timeline, as opposed to the 24fps timeline that I am working in now.
    Any ideas? It's not a processor issue (dual 2.5 GHz) or a RAM issue (2.5 GB of RAM). What's up?!

    There are many reasons why we ALL get dropped frames on playback. Some have been addressed here.
    For me, I always seem to get them when I'm viewing on an external monitor and don't have things rendered properly. A small crossfade that "looks" rendered but is actually only 'kind-of-rendered' so it plays in RT can easily clog playback, so you get dropped-frame warnings. Do you have ANY non-rendered things in the timeline? If you don't want to go through and trash your render files and re-render... at least do Command-A, then Option-R... the Option-R should render everything out to FULL render, not just what FCP thinks it needs to render to get it to play at your RT settings... FCP's often wrong there.
    CaptM

  • Are Analytic Workspaces suitable for very large data sets?

    Hi all,
    I have run many different tests with analytic workspaces and have used the different features (compression, composites...). The results, especially for maintenance, are disappointing.
    I have a star schema with 6 dimensions. The fact table has 730 million rows, the first dimension has 2.9 million rows, and the other 5 dimensions have between 25 and 300 rows each.
    My conclusion is that Analytic Workspaces don't help in situations like mine. The time for maintenance is very, very bad, not to mention the time for aggregations. I even tried to populate the cube in parts (90 million rows for the first load) but nothing changed. And there are some other problems with storage and tablespaces (I always get the message "unable to extend TEMP tablespace"; its size is 54 GB).
    Is there something I am missing? Does anyone have a similar problem or a different opinion?
    Thank you,
    Ilias

    A few other tips to add to Keith's excellent advice:
    - How many CPUs does your server have? The answer to this may help you decide the optimal level to partition at (in my experience, DAY is too low and can cause different problems). What other levels does your time dimension have? Are you loading your cubes in parallel?
    - To speed up your load, partition your underlying fact table with the same granularity as your cubes and place an index on the field mapped to the partition dimension
    - Are you using 10.2.0.3? If so, be very careful with the storage data type you choose when creating your cubes. The default in 10.2.0.3 is NUMBER which has the capability of storing data to 38 significant figures. This usually exceeds what is required for most datasets. If your dataset allows you to use storage of 15 significant figures then you should create your cubes using the DECIMAL data type instead. This will use about one third of the storage space and significantly increase your build speeds (in my experience, more than 3 times faster)
    - Make sure you have preallocated enough permanent and temporary tablespaces for your build. Autoextending can be very time consuming.
    - Consider reducing the amount of aggregation you do in batch. It should not be necessary to pre-aggregate everything in order to get good query performance.
    Generally, I would say that the volume should not be a problem. A single dimension with 2.9 million values is fairly big and can be slow (in OLAP terms) to query but that should not be an obstacle to building it in the first place.
    Good luck!
    Stuart

  • Video Problems (need suggestions for playing large videos)

    I have a video that I need to play in Authorware. The piece starts out as a Photo Slide Show, and then a video will come up. This will be published to an EXE file and loaded onto the computer's hard drive, for making a presentation of sorts.
    The Authorware piece will be published at 800 x 600, and I want the video to be as close to full-screen as possible.
    I currently have the video file in 4 versions:
    .wmv 720x480 (1,561 KB)
    .avi 720x480 (43,346 KB)
    .wmv 800x600 (2,295 KB)
    .avi 800x600 (656,711 KB)
    If I use any of these videos with the Digital Movie icon, the video will play in the published file, but there are always a few seconds of laggy video playback and dropped frames. I'd like to use the 800x600 .wmv file because of its much smaller file size. Is there another way to play this video that wouldn't have the laggy playback and skipped frames?
    If the answer is to use an ActiveX control, I'd need someone to point me in the direction of some guide to using it, as I've never used any ActiveX controls before.

    Thanks, I'll check that out.
    "Gary Overgaard" <[email protected]> wrote in message news:e4cmt7$nb3$[email protected]..
    > It's too bad the DirectMedia Xtra has been discontinued, as that works very well in Authorware at playing MPG, AVI and even WMV files. Very smooth playback and lots of control options. In the absence of that, the WMP ActiveX control would probably be your best-performing option. Steve Gannon has a great WMP 10 ActiveX example on his website at http://www.gantekmultimedia.com/download.htm
    > Good luck,
    > Gary
