How to export to very large dimensions?
I need to export a file to the specs below, but every time I try I get an 'Error compiling movie - Unknown error'.
Resolution 2732 pixels (w) x 768 pixels (h).
32:9 Landscape format.
File format is MPEG-4/H.264 AVC at 20 Mbps.
Frame rate is 25 as per PAL standard.
This error has been replicated across a number of fairly high-spec computers, so I don't think it is an issue with disk space etc. We're working on the assumption that the large dimensions are causing the problem.
The only solution we can come up with is to export to smaller dimensions (which we have done successfully) and then upscale but even that is proving challenging!
Any suggestions or ideas are very welcome!
Hi,
I was also unable to reproduce the error using your export specs.
Could you provide answers to these:
Are you able to reproduce this issue with multiple projects?
Did you try increasing the "level" to 5.1 under video settings?
What are the specs of the original media used in the project?
Regards
Vipul
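A bit of context on the Level question above: each H.264 level caps the frame size in 16x16 macroblocks, and 2732x768 just exceeds the Level 4.0 cap, which would explain an export failure at default settings. A rough sketch of the arithmetic (the limit values are taken from the H.264 level table; the helper function is illustrative, not the encoder's actual check):

```python
# Hedged sketch: check whether a frame size / frame rate fits an
# H.264 level's macroblock limits. Values are MaxFS (max frame size,
# in macroblocks) and MaxMBPS (macroblocks per second) per level.
import math

H264_LEVELS = {
    "4.0": (8192, 245760),
    "4.2": (8704, 522240),
    "5.0": (22080, 589824),
    "5.1": (36864, 983040),
}

def fits_level(width, height, fps, level):
    max_fs, max_mbps = H264_LEVELS[level]
    # Frame size in 16x16 macroblocks, rounding each dimension up.
    mbs = math.ceil(width / 16) * math.ceil(height / 16)
    return mbs <= max_fs and mbs * fps <= max_mbps

# 2732x768 = 171 x 48 = 8208 macroblocks: just over Level 4.0's 8192 cap.
print(fits_level(2732, 768, 25, "4.0"))  # False
print(fits_level(2732, 768, 25, "5.1"))  # True
```

So raising the Level to 5.1, as suggested, gives the encoder plenty of headroom for these dimensions.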
Similar Messages
-
Rapidly changing very large dimensions in OWB
Hello,
How are rapidly changing very large dimensions supported in OWB?
Is it supported directly by the tool, or do we have to depend on PL/SQL?
If so, is it supported by all versions of OWB from 9.0.3 through 10g?
TIA

Hi,
Use merge (insert/update or update/insert) if you have to update and insert too.
Ott Karesz
http://www.trendo-kft.hu -
How do I share a very large file?
Do you want to send a GarageBand project or the bounced audio file? Sending an audio file is not critical, but if you want to send the project, use "File > Compress" to create a .zip file of the project before you send it.
If you have a Dropbox account, I'd simply copy the file into the Dropbox "public" folder and mail the link. Right-click the file in the Dropbox, then choose Dropbox > Copy Public Link. This copies an Internet link to your file that you can paste anywhere: emails, instant messages, blogs, etc.
The first 2 GB on Dropbox are free. https://www.dropbox.com/help/category/Sharing -
How to Upload a Very Large Video on my MacBook?
I have a large video file on my camera and both iPhoto and Image Capture won't upload it. The camera is a Canon PowerShot SD750. Both iPhoto and Image Capture say they are uploading, but after a while iPhoto says it is done without uploading the video, and Image Capture just sits there saying it is transferring and never finishes. The video is probably a little under 2 GB in size. I currently have around 4 GB of space on my computer's hard drive.
Any other suggestions on how to get it onto my computer?

Welcome to the forum!
First and foremost, ALWAYS leave at least 10% free space on ALL hard drives ... especially the boot drive. The OS uses the free space for temp and swap files. Without enough free space things will start getting weird.
Since there isn't enough free space on your boot drive, delete any unneeded files or move them to another drive.
-DH -
I just upgraded my Aperture app, but now it will not allow me to make brush strokes inches wide when I need to lighten, darken or make other adjustments. In the previous version I could make adjustments inches wide. I really need to do this again. How can I do this?
The brush size is measured in pixels. The maximum is 1,000 px. If you have any pixel-filtering turned on, the maximum size drops to 200 px. In the Brush HUD, make sure "Detect Edges" is un-checked, and that "Brush Range" is "All".
-
How to save a very large number of samples?
I will need to sample 12 channels of signal with 10 MHz bandwidth simultaneously using two NI PCI-5105s for about 60 ms (12 x 3.6 million samples) at 60 Msamples/s, and save it to file(s).
I'm currently testing the system with only one channel, taking 6 million samples at 60 Msamples/s. The project file is attached to this message. After I pushed the [Run Once] button, I could see the memory usage climbing for a couple of minutes up to about 1.7 GB, and then it stopped with an error popup window: LabVIEW: Memory is full. VI "ExportLVMBlockState Time-WDT 3.vi" was stopped at node 0xEE8 of subVI "cmb_ex_CreateSignalChunkStringSub.vi".
The Windows XP computer that I was using has about 3.4 GB of physical memory.
Please advise me of a solution to this problem.
Thank You,
Sofyan Tan
Attachments:
2008_03_26.zip 6 KB

Currently, SignalExpress will not handle this amount of data. You have two options, depending on whether you want to do this task repetitively or single-shot.
1. You can use two instances of the NI-SCOPE Soft Front Panel if you have a way to trigger both devices simultaneously. A simple digital trigger will do it (wire the trigger to the digital input of both devices and set both to digital trigger). Setting both devices to the same analog trigger parameters may also work - take the same channel into both devices, just don't fetch it from the second. Unfortunately, the Soft Front Panel does not support internal synchronization. Divide the channel count evenly between the two devices and the internal memory of the 5105 should be sufficient to take 60 ms of data. Take one data set, then manually save the data to disk. I would recommend the HWS format or flat binary for this amount of data. With HWS, you can save both sets of data into the same file.
2. Use LabVIEW and NI-SCOPE to program it yourself. This is not particularly difficult, but it does require some less commonly used functions. The NI-SCOPE examples should give you the information you need. The rough sequence of events is: synchronize the scope cards, take a set of data, fetch the data in chunks from the scope cards and write it to disk. If you take this route, I would recommend you read Managing Large Data Sets in LabVIEW.
You may wish to repost your question to the NI-SCOPE forum for more feedback. Let us know if you need more help.
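The "fetch in chunks, write to disk" pattern in option 2 can be sketched in a few lines. This is illustrative only: fetch_chunk below is a stand-in for the real hardware fetch call (it just synthesizes 16-bit samples so the pattern runs without a scope card), and the chunk size is an arbitrary value chosen to keep memory usage bounded:

```python
# Sketch of chunked acquisition-to-disk. Instead of holding all
# 6,000,000 samples in memory at once (which exhausted LabVIEW's
# memory above), we fetch a bounded chunk, append it to a flat
# binary file, and repeat.
import struct

TOTAL_SAMPLES = 6_000_000
CHUNK_SAMPLES = 500_000  # bounded working-set size

def fetch_chunk(offset, count):
    """Stand-in for a hardware fetch: returns `count` 16-bit samples."""
    return [(offset + i) % 32768 for i in range(count)]

def acquire_to_file(path):
    with open(path, "wb") as f:
        written = 0
        while written < TOTAL_SAMPLES:
            count = min(CHUNK_SAMPLES, TOTAL_SAMPLES - written)
            chunk = fetch_chunk(written, count)
            # Flat binary, little-endian signed 16-bit samples.
            f.write(struct.pack(f"<{count}h", *chunk))
            written += count
    return written

acquire_to_file("capture.bin")
```

The same idea applies whatever the file format: the peak memory footprint is one chunk, not the whole record.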
This account is no longer active. Contact ShadesOfGray for current posts and information. -
Hi all, maybe it's a stupid question, but when I open the Analysis Services Processing Task editor in SSIS to process a dimension, I just cannot find ProcessAdd in the Process Options list?
thanks
Thanks sqlsaga, I did check but no luck ...
-- Currently using Reporting Services 2000; Visual Studio .NET 2003; Visual SourceSafe; SSIS 2008; SSAS 2008; SVN --
This is what you should do for the ProcessAdd option on a dimension:
http://www.mssqltips.com/sqlservertip/2997/using-processadd-to-add-rows-to-a-dimension-in-sql-server-analysis-services-ssas/
http://www.purplefrogsystems.com/blog/2013/09/dimension-processadd-in-ssas/
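For reference, the bare ProcessAdd request in XMLA looks roughly like this (the database and dimension IDs are placeholders, and a real ProcessAdd on a dimension usually also needs bindings to the new source rows; the links above cover driving it from SSIS):

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>MyOlapDatabase</DatabaseID>
      <DimensionID>MyDimension</DimensionID>
    </Object>
    <Type>ProcessAdd</Type>
  </Process>
</Batch>
```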
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Very large file upload 2 GB with Adobe Flash
Hello, does anyone know how I can upload very large files with Adobe Flash, or how to use SWFUpload?
Thanks in advance, all help will be much appreciated.

1. Yes.
2. I'm getting an error message from PHP:
if ($_FILES['Filedata']['error'] == 0) {
    if (move_uploaded_file($_FILES['Filedata']['tmp_name'], $uploads_dir . $_FILES['Filedata']['name'])) {
        echo 'ok';
        exit();
    }
}
echo 'error'; // Overhere
exit();
Note that for uploads approaching 2 GB, PHP's own limits (upload_max_filesize and post_max_size in php.ini) must also be raised, or move_uploaded_file will never see the file. -
Very large files checkin to DMS/content server
Dear experts,
We have DMS with content server up and running.
However, there is a problem with very large files (up to 2 GB!) - these files cannot be checked in due to extremely long upload times and possibly GUI limitations.
Does anyone have suggestions how we could get very large files into the content server ? I only want to check them in (either via DMS or directly to the content server) and then get the URL back.
best regards,
Johannes

Hi Johannes,
unfortunately there is a limit for files on the Content Server of about 2 GB. Please note that such large files will cause very long upload times. If possible, I would recommend you split the files into smaller parts and try to check those in.
From my point of view it is not recommended to put files directly on the Content Server, because this could lead to inconsistencies on the Content Server.
Best regards,
Christoph -
How to export a table with nearly 1 billion records in 9i
Hi,
I would like to ask your inputs on exporting a very large table (rows approximately nearing 1 billion).
What are the parameters and their values to have an acceptable export time?
Thanks in advance.

user13380363 wrote:
Hi,
I would like to ask your inputs on exporting a very large table (rows approximately nearing 1 billion).
What are the parameters and their values to have an acceptable export time?

Just in case there might be an alternative - what are you trying to accomplish?
(In other words, is export the correct tool for your job?)
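If conventional export does turn out to be the right tool, a direct-path export is the usual first lever on 9i. A sketch of an exp parameter file (all names and sizes here are placeholders, not a tested recommendation):

```
USERID=scott/tiger
TABLES=BIG_TABLE
DIRECT=Y
RECORDLENGTH=65535
FILESIZE=2G
FILE=(bt_1.dmp,bt_2.dmp,bt_3.dmp)
LOG=big_table.log
```

DIRECT=Y bypasses the SQL evaluation layer, RECORDLENGTH enlarges the direct-path I/O buffer, and FILESIZE splits the dump into manageable pieces.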
Note that it MIGHT be the right tool. However, based on experience, that is not a foregone conclusion until we understand what your end goal is intended to be. -
My PDF created in InDesign opens/views very large in Preview. How do I export so that the PDF views smaller?
melanie475689 wrote:
...in Preview, the file still opened at the same large size. Any other thoughts/suggestions?
Bob's post says it all. Preview ignores the PDF's Initial View setting, and beyond that, it's a horribly unreliable vehicle for accurate viewing of...just about anything. If you're engaged in professional design work, Preview has no legitimate place in your workflow. -
Every time we try to upload our persona, it shows up on screen very large. It seems like the dimensions are oversized, and it shows up on screen as only about 1/4 of the full design. It's like an extreme close-up view of what we built. We've built it to the specs provided on the site. If you want to check it out, the persona was created by Worth. Any suggestions on how to fix this?
- Inspect the dock connector on the iPod for bent or missing contacts, foreign material, corroded contacts, broken, missing or cracked plastic.
- Reset all settings
Go to Settings > General > Reset and tap Reset All Settings.
All your preferences and settings are reset. Information (such as contacts and calendars) and media (such as songs and videos) aren’t affected.
- Restore from backup. See:
iOS: How to back up
- Restore to factory settings/new iOS device.
If still problem, make an appointment at the Genius Bar of an Apple store since it appears you have a hardware problem.
Apple Retail Store - Genius Bar
Apple will exchange your iPod for a refurbished one for this price. They do not fix yours.
Apple - iPod Repair price -
In Mail on my iMac, successfully running OS X Lion, one mailbox on My Mac for "Recovered Messages (from AOL)" keeps showing 1 very large message (more than 20 MB) that I just cannot seem to delete. Each time I go into my Inbox, the "loading" symbol spins and the message appears in the "Recovered Messages" mailbox. How can I get rid of this recurring file, please?
At the same time, I'm not receiving any new mail in my Inbox, although, if I look at the same account on my MacBook Pro, I can indeed see the incoming mail (but on that machine I do not have the "recovery" problem).
The help of a clear-thinking Apple fan would be greatly appreciated.
Many thanks.
From Ian in Paris, France

Ian,
I worked it out.
Unhide your hidden files (I used a widget from http://www.apple.com/downloads/dashboard/developer/hiddenfiles.html).
Go to your HD.
Go to Users.
Go to your House (home)
there should be a hidden Library folder there (it will be transparent)
Go to Mail in this folder
The next folder ( for me ) is V2
Click on that and the next one will be a whole list of your mail servers, and one folder called Mailboxes
Click on that and there should be a folder called Recovered Messages (server).mbox
Click on that; inside is a random numbered/lettered folder -> Data
In that Data folder is a list of random numbered folders (i.e. a folder called 2, one called 9, etc.) and in EACH of these, another numbered folder, and then a folder called Messages.
In the Messages folder, delete all of the .emlx files (sorry I can't be more precise - I already emptied my trash after my golden moment).
This was GOLDEN for me. The reason being: when I went to delete my "recovered file" in Mail, it would give me an error message "cannot delete 2500 files". I knew it was only 1 file, so this was weird. Why 2500 files? Because when I clicked on the .emlx files, hey presto, it turned out that they were ALL THE SAME MESSAGE = 2500 times, in each of those numbered folders, in their related Messages folder.
Now remember - DON'T delete the folder; make sure you have gone to the Messages folder, found all those pesky .emlx files and deleted THOSE, not the folder.
It worked for me. No restarting or anything. And recovered file. GONE.
Started receiving and syncing mail again. Woohoo.
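The manual walk above can be sketched as a small script. This is illustrative, not a supported tool: back up the mailbox first, pass the "Recovered Messages (server).mbox" folder explicitly, and never point it at all of ~/Library/Mail:

```python
# Hedged sketch: walk a Mail mailbox tree and delete the duplicated
# .emlx message files found inside any "Messages" folder, leaving
# the folder structure itself intact (as the steps above stress).
import os

def delete_emlx(root):
    """Delete .emlx files under any 'Messages' directory below root.

    Returns the number of files removed; folders are left in place.
    """
    removed = 0
    for dirpath, dirnames, filenames in os.walk(root):
        if os.path.basename(dirpath) == "Messages":
            for name in filenames:
                if name.endswith(".emlx"):
                    os.remove(os.path.join(dirpath, name))
                    removed += 1
    return removed
```

For example, delete_emlx("/path/to/Recovered Messages (server).mbox") would report how many duplicate message files it cleared.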
Best wishes. -
How can we suggest a new DBA OCE certification for very large databases?
What web site can we visit, or what phone number can we call, to suggest creating a VLDB OCE certification?
The largest databases that I have ever worked with are barely over 1 trillion bytes.
Some people told me that the results of being a DBA totally change when you have a VERY LARGE DATABASE.
I could guess that some of the following configuration topics might be on it:
* Partitioning
* parallel
* bigger block size - DSS vs OLTP
* etc
Where could I send in a recommendation?
Thanks, Roger

I wish there were some details about the OCE data warehousing exam.
Look at the topics for 1Z0-515. Assume that the 'lightweight' topics will go (like Best Practices) and that there will be more technical topics added.
Oracle Database 11g Data Warehousing Essentials | Oracle Certification Exam
Overview of Data Warehousing
Describe the benefits of a data warehouse
Describe the technical characteristics of a data warehouse
Describe the Oracle Database structures used primarily by a data warehouse
Explain the use of materialized views
Implement Database Resource Manager to control resource usage
Identify and explain the benefits provided by standard Oracle Database 11g enhancements for a data warehouse
Parallelism
Explain how the Oracle optimizer determines the degree of parallelism
Configure parallelism
Explain how parallelism and partitioning work together
Partitioning
Describe types of partitioning
Describe the benefits of partitioning
Implement partition-wise joins
Result Cache
Describe how the SQL Result Cache operates
Identify the scenarios which benefit the most from Result Set Caching
OLAP
Explain how Oracle OLAP delivers high performance
Describe how applications can access data stored in Oracle OLAP cubes
Advanced Compression
Explain the benefits provided by Advanced Compression
Explain how Advanced Compression operates
Describe how Advanced Compression interacts with other Oracle options and utilities
Data integration
Explain Oracle's overall approach to data integration
Describe the benefits provided by ODI
Differentiate the components of ODI
Create integration data flows with ODI
Ensure data quality with OWB
Explain the concept and use of real-time data integration
Describe the architecture of Oracle's data integration solutions
Data mining and analysis
Describe the components of Oracle's Data Mining option
Describe the analytical functions provided by Oracle Data Mining
Identify use cases that can benefit from Oracle Data Mining
Identify which Oracle products use Oracle Data Mining
Sizing
Properly size all resources to be used in a data warehouse configuration
Exadata
Describe the architecture of the Sun Oracle Database Machine
Describe configuration options for an Exadata Storage Server
Explain the advantages provided by the Exadata Storage Server
Best practices for performance
Employ best practices to load incremental data into a data warehouse
Employ best practices for using Oracle features to implement high performance data warehouses -
I am a scientist and run my own business. Money is tight. I have some very large Excel files (~200 MB) that I need to sort and perform logic operations on. I currently use a MacBook Pro (i7 core, 2.6 GHz, 16 GB 1600 MHz DDR3) and I am thinking about buying a multicore Mac Pro. Some of the operations take half an hour to perform. How much faster should I expect these operations to be on a new Mac Pro? Is there a significant speed advantage in the 6-core vs 4-core? Practically speaking, what are the features I should look at, and what is the speed bump I should expect if I go to 32 GB or 64 GB? Related to this, I am using a 32-bit version of Excel. Is there a 64-bit spreadsheet that I can use on a Mac that has no limit on column and row size?
Grant Bennet-Alder,
It’s funny you mentioned using Activity Monitor. I use it all the time to watch when a computation cycle is finished so I can avoid a crash. I keep it up in the corner of my screen while I respond to email or work on a grant. Typically the %CPU will hang at ~100% (sometimes even saying the application is not responding in red) but will almost always complete the cycle if I let it go for 30 minutes or so. As long as I leave Excel alone while it is working it will not crash. I had not thought of using the Activity Monitor as you suggested. Also I did not realize using a 32 bit application limited me to 4GB of memory for each application. That is clearly a problem for this kind of work. Is there any work around for this? It seems like a 64-bit spreadsheet would help. I would love to use the new 64 bit Numbers but the current version limits the number of rows and columns. I tried it out on my MacBook Pro but my files don’t fit.
The hatter,
This may be the solution for me. I’m OK with assembling the unit you described (I’ve even etched my own boards) but feel very bad about needing to step away from Apple products. When I started computing this was the sort of thing computers were designed to do. Is there any native 64-bit spreadsheet that allows unlimited rows/columns, which will run on an Apple? Excel is only 64-bit on their machines.
Many thanks to both of you for your quick and on point answers!
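One further thought on the 64-bit question, offered as a sketch rather than a recommendation: if the blocker is really sorting and filtering rather than spreadsheet editing, a 64-bit Python process has no 4 GB cap and no row/column limit. The column names below are invented for illustration:

```python
# Hedged sketch: sort a large CSV by a numeric column entirely
# outside Excel, using only the standard library. A 64-bit Python
# process can hold a ~200 MB file in memory comfortably.
import csv

def sort_csv(src, dst, key_column):
    """Sort src by the numeric column key_column, writing to dst.

    Returns the number of data rows processed.
    """
    with open(src, newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames
        rows = list(reader)          # whole file in memory: fine in 64-bit
    rows.sort(key=lambda r: float(r[key_column]))
    with open(dst, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

Exporting the Excel sheets to CSV once, then doing the heavy sort/filter passes in a script, sidesteps both the 32-bit memory ceiling and the row limits mentioned above.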