Incorrect data rates reported by Vimeo

I am getting this reported by Vimeo:
"This video's data rate is only 4236 kbit/s, which is lower than what we recommend and means your video might not look as nice as it should. For HD video, we suggest you use a higher data rate, typically 5000 kbit/s."
after uploading from Adobe Media Encoder via Premiere Pro 2014.1 using the Vimeo 720 encoding preset.
I have always used this preset over the last couple of years, but I have started noticing this warning message appear over the last six months or so.
Perhaps these presets need looking at, or am I doing something wrong?
James
  Model Name: Mac Pro
  Model Identifier: MacPro6,1
  Processor Name: 8-Core Intel Xeon E5
  Processor Speed: 3 GHz
  Number of Processors: 1
  Total Number of Cores: 8
  L2 Cache (per Core): 256 KB
  L3 Cache: 25 MB
  Memory: 16 GB
  Boot ROM Version: MP61.0116.B04
  SMC Version (system): 2.20f18
  Illumination Version: 1.4a6
  Hardware UUID: 610B8801-5C80-5CEF-8892-EE0D95066882

Vimeo may have raised its recommended bit rate. Increase the bit rate in the preset, make sure it is set to 2-pass encoding, and see if the quality is better. A quick way to check what an export actually averages is sketched below.
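One way to sanity-check the average data rate of an exported file, independent of what the preset claims, is to work it out from the file size and the clip duration. A minimal sketch in Python; the file name and duration are placeholders for your own export, and this measures the whole file (video plus audio), not the video stream alone:

    # Rough check of an exported file's average data rate (sketch, not an Adobe or Vimeo tool).
    import os

    path = "export_720p.mp4"    # placeholder: your exported file
    duration_s = 240.0          # placeholder: clip length in seconds

    size_bits = os.path.getsize(path) * 8
    avg_kbps = size_bits / duration_s / 1000    # average rate in kbit/s

    print(f"average data rate: {avg_kbps:.0f} kbit/s")
    if avg_kbps < 5000:
        print("below Vimeo's suggested 5000 kbit/s for HD")

If the result sits well below the preset's target, the preset settings (or very static source footage) are the place to look.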

Similar Messages

  • Report Server 2008 R2 incorrect Data in Report (Matrix)

    Hi,
    I am trying to build a report based on a stored procedure.
    The query returns data like:
    The matrix should show data per EmplID and FOSGRP (rows) and per BaseType (column).
    But the matrix shows data like this (eLearning and ILT are different BaseTypes). The rows and columns are correct, but the values are completely wrong, as you can see in the image above.
    Matrix shows data like this:
    The correct totals should be eLearning = 11.01 and ILT = 12. Where do the wrong values come from?
    Total on FOSGRP.
    The matrix contains two row groupings (1. per EmplID, 2. per FOSGRP) and one column group per BaseType (contains eLearning/ILT).
    I also tried it with a table object, and it shows wrong numbers as well.
    What's going wrong here? Why is SSRS doing calculations on the detail data?
    The raw data returned from the query works fine in an Excel pivot; it produces the correct totals.
    Thank you for any ideas and help.
    Ivo

    Hi IvoSutter,
    According to your description, you have a stored procedure that retrieves data from the database, and the data displayed in the matrix is different from the records returned by the query. Right?
    In this scenario, your report design is correct. In Reporting Services, if the query returns only 4 rows in Query Builder, the report is supposed to show those same records. Based on your second picture, it seems there are other records for
    the EmployeeID, so the TotalCredit is not correct. Please try to execute the query in both Query Builder and SQL Server Management Studio and check whether they return the same records. As we tested in our local environment, it works properly. Here are screenshots:
    If you can't get the same records in those two environments, please restart the Report Server in Reporting Services Configuration Manager.
    If you have any question, please feel free to ask.
    Best Regards,
    Simon Hou
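    Since the raw data pivots correctly in Excel, one quick way to cross-check the totals outside SSRS is a pandas pivot. A minimal sketch, assuming the stored procedure's result set has been exported to CSV and that the value column is called "Credit" (the file name and value column are assumptions; EmplID, FOSGRP and BaseType are the names from the post):

        # Cross-check the matrix totals outside SSRS (sketch).
        import pandas as pd

        raw = pd.read_csv("raw_data.csv")   # hypothetical export of the query result

        matrix = pd.pivot_table(
            raw,
            index=["EmplID", "FOSGRP"],     # row groups, as in the report
            columns="BaseType",             # column group (eLearning / ILT)
            values="Credit",                # assumed value column name
            aggfunc="sum",
            margins=True,                   # adds the totals row/column
        )
        print(matrix)

    If this reproduces eLearning = 11.01 and ILT = 12 while the matrix does not, the dataset the report actually receives differs from the one you see in Query Builder.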

  • Net due date incorrect in FBL3N report

    Hi,
    The net due date appears correctly in the vendor line item report FBL1N; however, when we look at the same entry on the reconciliation account in FBL3N, we get the wrong due date. We are going through the G/L head to get the correct amount, which tallies with the F.08 data ageing group-wise. For this reason the net due date is required.
    Let me know if there is any other way I can proceed.
    Regards,
    Devdatth

    Hi,
    Waiting for reply.
    I need to know why FBL1N shows the correct net due date while FBL3N shows an incorrect one.
    For example:
    In FBL1N the entry is:
    doc date       Baseline date    Net due date    Pay terms
    08.10.2012     08.11.2012       08.12.2012      30 days
    The same entry in FBL3N is:
    doc date       Baseline date    Net due date    Pay terms
    08.10.2012     08.11.2012       08.11.2012      30 days
    i.e. the FBL3N report should also show 08.12.2012, but it is showing an incorrect date.
    Regards,
    Devdatth
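    For reference, the net due date FBL1N shows is simply the baseline date plus the payment-terms days. A quick check of the example above in Python (dates in DD.MM.YYYY, taken from the post):

        # Net due date = baseline date + payment-terms days (worked example from the post).
        from datetime import datetime, timedelta

        baseline = datetime.strptime("08.11.2012", "%d.%m.%Y")   # baseline date
        net_due = baseline + timedelta(days=30)                  # 30-day payment terms

        print(net_due.strftime("%d.%m.%Y"))   # 08.12.2012, the value FBL1N shows;
                                              # FBL3N's 08.11.2012 ignores the terms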

  • Incorrect date format in Crystal Report XI

    I have a problem with dates in reports when the day in the date (yyyy/mm/dd) is less than 12.
    The problem is:
    I create a report (through the application). If I check the instance of the report created on Crystal Server (in the Central Management Console), it is correct. If I download (open or save) this report to a workstation (through the application), the day and month change places and the date becomes incorrect.
    Playing with the regional settings on the workstation did not help.
    What is the problem here and how can I resolve the issue?
    Any advice will be appreciated.
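    This looks like the classic day/month ambiguity: when the day is 12 or less, the same string is valid under both a month-first and a day-first locale, so nothing fails, the values just swap. A tiny illustration in Python (the date string is made up, not taken from Crystal):

        # Why only dates with day <= 12 come out wrong: both parses succeed.
        from datetime import datetime

        s = "2014/05/03"   # hypothetical value: 3 May or 5 March?

        month_first = datetime.strptime(s, "%Y/%m/%d")   # server-side interpretation
        day_first = datetime.strptime(s, "%Y/%d/%m")     # workstation interpretation

        print(month_first.date(), day_first.date())      # 2014-05-03 vs 2014-03-05
        # With a day greater than 12, one of the two parses would raise an error
        # instead of silently swapping day and month.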

    Hi,
    OK, here is a link for an update; maybe it helps:
    [Click here|https://websmp230.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/bobj_download/main.htm]
    Regards,
    Clint

  • Crystal report showing Incorrect data against Bex Query

    I have created a Crystal report on top of a BEx query. I have not done any calculations in the report, just dragged the structure and key figure into the report, and ran it for one month. When I compare it with the BEx query for the same period, it shows incorrect data.
    I don't know what could be wrong here.
    We are on BusinessObjects 3.1 SP3.
    Thanks,
    Ravi

    Hi,
    I would suggest you trace the workflow in Web Intelligence according to these notes:
    Tracing:
    Unix : Note 1235111
    Windows : Note 1260004
    and then post a link for downloading the trace files here.
    regards
    Ingo

  • BOXIR2 FP4.6 -WebI report with merged dimensions: Incorrect data

    Issue: BOXIR2 FP 4.6 -WebI report with merged dimensions: Max value of an object from second query is displayed in the report even when the formula is based on the object from the first Query.
    Data displays properly on BOXIR2 SP2.
    Steps to reproduce.
    1. Use the 'Island Resort Marketing' universe to create a WebI report.
    2. Create a query with the objects 'Reservation Date' and 'Future guests', apply a query filter where 'Reservation Date' is between 1/1/2001 and 31/1/2001, and run the query.
    3. Add a second query with the same objects, i.e. 'Reservation Date' and 'Future guests', and apply a query filter where 'Reservation Date' is between 1/1/2001 and 31/12/2001.
    4. Create a report variable MaxDate using the syntax "=Max([Query 1].[Reservation Date]) In Report".
    5. Use the objects 'Reservation Date Query 2', 'Future guests Query 2' and the 'MaxDate' variable to create a table in the report.
    You will find that the MaxDate column displays the max date values of 'Reservation Date Query 2' instead of 'Reservation Date Query 1'.
    Could you please advise me on this issue?

    Hi Hiteshwar,
    The MaxDate variable returns the maximum Reservation Date from Query 1, i.e. 1/15/01, which is correct. If you put that variable in a block on its own, it displays that date correctly; but you are grouping it along with objects from Query 2, which is why it displays the correct value only where Query 2 also contains 1/15/01. For the other rows WebI can't group the value, because its formula refers to the Query 1 reservation date.
    So we can't say it is displaying values from Query 2; it takes the value from Query 1 itself for 1/15/2001, and for the remaining rows it pairs it with non-matching values from Query 2, with which it is grouping.
    If you put that variable with objects from Query 1, it won't display only 1/15/01 but all the reservation date values Query 1 contains, because it groups based on reservation date even though it is a single max value.
    Likewise, if you write a "=Max([Query 2].[Reservation Date])" formula and use it with objects from Query 1, although Query 2 contains many more values it matches (groups) only the values of the Query 1 reservation date and displays only those, i.e. 1/1/2001 and 1/15/01.
    MaxDate returns 1/15/01 as the maximum date value from Query 1, but Query 2 contains values far greater than 1/15/01, so it would be even more incorrect if WebI displayed MaxDate as 1/15/01 alongside Query 2 objects.
    We could avoid this behaviour with custom SQL, but MaxDate is a report-level variable, so we can't edit the SQL.
    If your data contains some duplicate values, WebI will repeat the MaxDate value, like:
    Reservation date    Restaurant    Future_Guests    Total Guests
    1/1/01              ABC           3                3
    1/15/01             XYZ           4                8
    1/16/01             XYZ           4                8
    By default, WebI groups Total Guests based on Restaurant and Reservation Date, but if we want we can group those values based only on Restaurant, which will duplicate the Total Guests value for restaurant XYZ. Your report, however, contains only one dimension with all unique values, so we can't group those.
    I hope this is clear.
    Thanks…
    Pratik

  • (Please note) - Report - Bluetooth with Enhanced Data Rate Software II for Windows 7 wipe main drive

    All,
    There have been reports that upgrading the Bluetooth with Enhanced Data Rate Software II for Windows 7 to version 6.4.0.1700 can cause a wipe of the main OS drive (commonly C:\).
    Version:
    6.4.0.1700
    Release Date:
    2011/05/16
    Affected units which use this version of the driver are as follows:
    Support models ThinkPad L420, L421
    ThinkPad L520
    ThinkPad T420, T420i, T420s, T420si
    ThinkPad T520, T520i
    Thinkpad W520
    ThinkPad X1
    ThinkPad X220, X220i, X220 Tablet, X220i Tablet
    ThinkPad Edge E220s
    ThinkPad Edge E420, E420s
    ThinkPad Edge E520
    The issue has been raised with engineering and the team is currently working on it.
    ****Please do not update the Bluetooth driver until further notice****
    We are in the process of pulling the driver from the Support Site and ThinkVantage System Update.
    Main thread discussion here
    //JameZ

    All,
    Updated drivers have been released to the web team and should be published in the next couple of days.
    Will update this thread once they are live - the official solution will be to use the newer drivers.
    mark
    Mark Hopkins
    Program Manager, Lenovo Social Media (Services)

  • Max Data Rate: AGP 2x?!

    Bought myself a new MSI-KT6 Delta a few weeks ago (MSI-6590), and have had no problems until now.
    First up, my GeForce 4 Ti 4600 decided to suddenly die. Random characters on the screen in text mode, flickery colours and corrupt pixels everywhere. Windows wouldn't detect it properly anymore.
    Bought a new Powercolor Radeon 9600 XT with VIVO today. Downloaded the new drivers, and loaded up some Battlefield, all good. Runs nice and smooth. COD, Quake 3, etc, no problems at all.
    Halo, however... BAD. Even with the basic graphics settings and no Pixel-Shading, it runs at 4 frames a second, and nothing I do changes this, from lowest to highest with Pixel-Shaders 2.0. Although there's a visual difference, it's just as slow.
    So I loaded up ATI control Panel and used SmartGart to enable 8x and Fast-Writes... Rebooted, lots of flickering screen, and Windows XP is now software-rendering everything (Transparent menus take 100% CPU to fade in)
    Wait for SmartGart to fix my graphics again, and it's all back to 4x and no fast-write.
    Loaded up SiSoft Sandra, to check the settings in there...
    Mainboard: "Max Data Rate: AGP 2x"
    AGP Card: "Data Transfer Modes Support: 1x, 2x"
    "AGP Bus is unused or AGP card not fully AGP compatible"
    Er... should this be right? The BIOS won't let me select an AGP speed; it's set to "Auto" and I can't change it. I'm a little concerned that my motherboard may have been damaged! I'll go get the latest Halo patch, but that doesn't change the fact that I can't use 8x or Fast Writes.
    I'm a little apprehensive about flashing my BIOS, made more difficult by the fact that I have no working floppy drive.
    CPU: AMD 2600+
    RAM: 2x512mb DDR 333MHz Kingston
    AlphaGremlin

    Well, I updated the VIA drivers and installed an updated Sandra. AGP is running at 8x, and Fast Writes are enabled! I don't know what the hell SmartGart was going on about, but it appears to have slunk away and disappeared completely from the ATI control panel since the last driver reinstall.
    So there's one problem solved... but I don't know what's up with Halo... I can set it to the lowest settings and it still runs like an elephant with three legs, even with the -useff option for Fixed Function shaders... ah well, that's a matter for another forum, eh?
    Interesting note: Sandra reports my 120gb drive as 605gb... should I be trusting it?
    AlphaGremlin

  • VISA Read gets incorrect data from serial connection

    I am having difficulty using the VISA functions in LabVIEW to read data from a virtual COM port. Data is being sent from a serial to USB chip via a USB connection using OpenSDA drivers. I have a python program written to read from this chip as well, and it never has an issue. However, when trying to achieve the same read in LabVIEW I am running into the problem of getting incorrect data in the read buffer using the VISA Read function.
    I have a VISA Configure Serial Port function set up with a control to select the COM port that the device is plugged into. Baud rate is the default 9600. The termination char of my data is a newline, which is also the default, and Enable Termination Char is set to true. A VISA Open function follows this configuration and then feeds the VISA Resource Name Out into a while loop, where a VISA Read function displays the data in the read buffer. The byte count for the VISA Read is set to 20 so I can see more of the erroneous data; the actual data will only be 6-12 bytes. The while loop has a wait function, and no matter how much I slow down the readings I still get incorrect data (I have tried 20 ms through 1000 ms).
    The data I expect to receive in the read buffer from VISA Read is in the form of "0-255,0-255,0-255\n", like the following:
    51,93,31\n
    or
    51,193,128\n
    And occasionally I receive this data correctly, however I intermittently (sometimes every couple reads, sometimes a couple times in a row) get incorrect readings like this:
    51,1\n
    51,193739\n
    \n
    51,1933,191\n
    51,,193,196\n
    51,1933,252 
    51,203,116203,186\n
    It seems like the read data is truncated, missing characters, or has additional characters. Looking at these values, however, it seems like the read was done incorrectly, because the bytes are partially correct (51 is the first number even in the incorrect reads).
    I have searched but haven't found a similar issue, and I am not sure what to try from here. Any help is appreciated. Thanks!
    Attachments:
    Serial_Read_debugging.vi ‏13 KB

    The first thing is that none of the error clusters are connected, so you could be getting errors that you are not seeing. Are you sure about the comm parameters? Finally, I have never had a lot of luck relying on termination characters. You might want to just capture the data and append each read onto one long string, to see if you are still seeing this strangeness (a rough sketch of that approach is below).
    What sort of device is returning the data? How often does it spit out the data? How much distance is there between it and your computer? Can you configure it to append something like a checksum or CRC to the data?
    Mike...
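    As a cross-check outside LabVIEW, the "append each read into one long string and split on the terminator yourself" idea looks roughly like this in Python with pyserial (the port name, chunk size and timeout are assumptions). If the frames come out clean here, the data on the wire is fine and the problem is in how the reads are being framed:

        # Accumulate raw serial bytes and split frames on the newline terminator ourselves,
        # instead of trusting a fixed byte count per read (sketch; port name is a guess).
        import serial

        ser = serial.Serial("COM3", 9600, timeout=0.1)
        buffer = b""

        try:
            while True:
                buffer += ser.read(64)            # grab whatever has arrived (may be partial)
                while b"\n" in buffer:
                    frame, buffer = buffer.split(b"\n", 1)
                    print(frame.decode(errors="replace"))   # e.g. "51,93,31"
        finally:
            ser.close()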

  • PGC has an error--data rate of this file is too high for DVD

    Getting one of those seemingly elusive PGC errors, though mine seems to be different from many of the ones listed here. Mine tells me that the data rate of my file is too high for DVD. The only problem is that the file it points to is a slideshow that Encore built from imported JPEG files. I got the message, went into the slideshow and deleted the photo at the particular spot in the timeline where it reported the problem, and now I'm getting the same message again at a different timecode in the same slideshow. The pictures are fairly big, but I assumed Encore would automatically resize them to fit an NTSC DVD timeline. Do I need to open all the pictures in Photoshop and scale them down to 720x480 before I build the slideshows?
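    If you do decide to pre-scale the stills yourself rather than let Encore do it, here is a minimal batch-resize sketch with Pillow in Python (the folder names are placeholders; Photoshop's Image Processor does the same job):

        # Batch-resize stills to 720x480 before importing into Encore
        # (sketch; note a straight resize to 720x480 ignores the original aspect ratio).
        import os
        from PIL import Image

        src, dst = "slideshow_src", "slideshow_720x480"
        os.makedirs(dst, exist_ok=True)

        for name in os.listdir(src):
            if name.lower().endswith((".jpg", ".jpeg")):
                img = Image.open(os.path.join(src, name))
                img = img.resize((720, 480), Image.LANCZOS)
                img.save(os.path.join(dst, name), quality=90)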

    With those efforts, regarding the RAM, it would *seem* that physical memory was not the problem.
    I'd look to how Windows is managing both the RAM addresses and also its Virtual Memory. To the former, I've seen programs/Processes that lock certain memory addresses upon launch (may be in startup), and do not report this to Windows accurately. Along those lines, you might want to use Task Manager to see what Processes are running from startup on your machine. I'll bet that you've got some that are not necessary, even if IT is doing a good job with the system setup. One can use MSCONFIG to do a trial of the system, without some of these.
    I also use a little program, EndItAll2, for eliminating all non-necessary programs and Processes when doing editing. It's freeware, has a tiny footprint, and usually does a perfect job of surveying your running programs and Processes to shut them down. You can also modify its list, in case it wants to shut down something that IS necessary. I always Exit from my AV, spyware, popup-blocker, etc., as these programs will lie to EndItAll2 and say that they ARE necessary, as part of their job. Just close 'em out in the Tasktray, then run EndItAll2. Obviously, you'll need to do this with the approval of IT, but NLE machines need all available resources.
    Now, to the Virtual Memory. It is possible that Windows is not doing a good job of managing a dynamic Page File. Usually, it does, but many find there is greater stability with a fixed size at about 1.5 to 2.5x the physical RAM. I use the upper end with great results. A static Page File also makes defragmenting the HDD a bit easier too. I also have my Page File split over two physical HDD's. Some find locating to, say D:\ works best. For whatever reason, my XP-Pro SP3 demanded that I have it on C:\, or split between C:\ and D:\. Same OS on my 3 HDD laptop was cool having it on D:\ only. Go figure.
    These are just some thoughts.
    Glad that you got part of it solved and good luck with the next part. Since this seems to affect both PrPro and En, sounds system related.
    Hunt
    PS Some IT techs love to add all sorts of monitors to the computers, especially if networked. These are not usually bad, but they are usually out of the mainstream, in that most users will never have most of them. You might want to ask about any monitors. Also, are you the only person with an NLE computer under the IT department? In major business offices this often happens. Most IT folk do not have much, if any, experience with graphics or NLE workstations; they spend their days servicing database, word processing and spreadsheet boxes.

  • Incorrect date getting determined

    Hi,
    I'm not sure what is happening. I have written the code below. The begda and endda are between 11.06.2012 and 24.06.2012, but the begda for the particular pernr is 14.05.2012. gt_pa0008 has the correct values, but very strangely gs_pa0008 is fetching an incorrect date while the other fields are correct.
    PROVIDE FIELDS * FROM gt_pa0001 INTO gs_pa0001 VALID flag1
               BOUNDS begda AND endda
               FIELDS * FROM gt_pa0008 INTO gs_pa0008 VALID flag2
               BOUNDS begda AND endda
               FIELDS * FROM gt_pa0007 INTO gs_pa0007 VALID flag3
               BOUNDS  begda AND endda
        where begda <= lv_date AND endda >= wa_t549q-begda
         BETWEEN pn-begda AND pn-endda.
    Thank you.
    Regards,
    Narayani

    This relates to an incorrect FX rate determination when a specific condition record is used.
    Under normal circumstances this should not occur, but it looks like, if the ZHIV condition record is set up in EUR rather than (in this case) HRK, the values in the sales order / invoice and subsequently in finance are converted back to front (i.e. 1 EUR = 7 HRK rather than 7 HRK = 1 EUR).
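    A one-line illustration of what "converted back to front" does to the numbers (the 7.0 rate is from the post; the amount is made up):

        # Effect of applying an exchange rate in the wrong direction (illustrative figures).
        amount_hrk = 700.0
        rate_hrk_per_eur = 7.0

        correct_eur = amount_hrk / rate_hrk_per_eur    # 100.0 EUR
        inverted_eur = amount_hrk * rate_hrk_per_eur   # 4900.0 "EUR" - rate applied back to front

        print(correct_eur, inverted_eur)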

  • Data Rate connection query / SNR

    Hi All
    I have BT Infinity up to 40Mb at my local cabinet.
    The BT checker says my line can get up to 24Mb, which I've always been happy with.
    I have always got around the 19Mb mark, but looking at my connection stats I am wondering if I should be getting more speed.
    My data rate for the connection is always 19991, SNR up is around 6, and SNR down is usually 15-20.
    You can see my current connection stats in the table below.
    I don't understand why the 'Attainable Rate' shows as '40867'.
    Am I missing something simple?
    Cheers for looking
    Craig
    VDSL
    Link Status: Showtime
    Firmware Version: 1412f0
    VDSL2 Profile: 17a

    Basic Status          Upstream    Downstream    Unit
    Actual Data Rate      6799        19991         Kb/s
    SNR                   58          179           0.1 dB

    Advance Status        Upstream    Downstream    Unit
    Actual delay          0           0             ms
    Actual INP            0           0             0.1 symbols
    15M CV                0           0             counter
    1Day CV               16          81            counter
    15M FEC               0           0             counter
    1Day FEC              34          636           counter
    Total FEC             283         13238678      counter
    Previous Data Rate    6811        19991         Kbps
    Attainable Rate       6799        40867         Kbps
    Electrical Length     200         200           0.1 dB
    SNR Margin            58          N/A           (US0,--) 0.1 dB
    SNR Margin            59          180           (US1,DS1) 0.1 dB
    SNR Margin            N/A         179           (US2,DS2) 0.1 dB
    SNR Margin            N/A         N/A           (US3,DS3) 0.1 dB
    SNR Margin            N/A         N/A           (US4,DS4) 0.1 dB
    15M Elapsed time      20          20            secs
    15M FECS              0           0             counter
    15M ES                0           0             counter
    15M SES               0           0             counter
    15M LOSS              0           0             counter
    15M UAS               0           0             counter
    1Day Elapsed time     6321        6321          secs
    1Day FECS             4           71            counter
    1Day ES               7           56            counter
    1Day SES              0           0             counter
    1Day LOSS             0           0             counter
    1Day UAS              76          76            counter
    Total FECS            149         135577        counter
    Total ES              7917        38096         counter
    Total SES             13          76            counter
    Total LOSS            0           10            counter
    Total UAS             180         625           counter

    You can test your line here https://www.bt.com/consumerFaultTracking/public/faults/reporting.do?pageId=21
    Is there any noise on the phone?
    If you have a line fault, you need an engineer to fix it.
    Once the instability is resolved, DLM will automatically raise the speed so your SNR Margin goes to 6 when you resync.

  • FYI DU error:"Reserved fields in the catalog record have incorrect data"

    I had a very perplexing problem...
    My iMac G5 was shutting down by itself so I took it to Apple and had the power supply replaced. After it was repaired I was concerned about the hard drive health after the unexpected shutdowns. So I ran Disk Utility and encountered the following...
    The iMac G5 is running OS 10.4.6.
    I ran "Repair Disk Permissions" from Disk Utility and encountered no errors.
    I ran "Verify Disk" from Disk Utility on the drive and I get these results (copied from DiskUtility.log):
    Verifying volume “TheDisk”
    Checking HFS Plus volume.
    Checking Extents Overflow file.
    Checking Catalog file.
    Checking multi-linked files.
    Checking Catalog hierarchy.
    Checking Extended Attributes file.
    Reserved fields in the catalog record have incorrect data
    Checking volume bitmap.
    Checking volume information.
    The volume TheDisk needs to be repaired.
    Error: The underlying task reported failure on exit
    Disk Utility stopped verifying “TheDisk” because the following error was encountered:
    The underlying task reported failure on exit
    1 HFS volume checked
    Volume needs repair
    This occurs whether I run it while booted from the hard drive or the restore disc. While booted from the restore disc, running repair disc indicates that the failure could not be fixed.
    Booted from the hardware test disc and all tests passed.
    Booted from TechTool Pro disc and all the tests passed.
    Booted from Disk Warrior (had to get the "free" update to boot my iMac G5) and rebuilt directory. No errors reported.
    Still same error reported in Disk Utility. ???
    Booted iMac in 'safe' mode. Rebooted iMac normally.
    Disk Utility now reports success!

    Sorry, but this is not assistance to you, just an additional request for help from someone!
    I had the same thing occur just now! I have tried everything in sight and the error keeps returning.
    Hope someone has an answer as to why this occurred...

  • Encoded video does not have expected data rate

    Hi
    I've been encoding with Flash Media Encoder (CS3) recently and have come across a contradiction which I can't seem to make sense of. Can it be possible that the longer the video, the harder it is for the encoder to maintain the data rate?
    This was revealed when I started using FMS2 (space on a shared server as opposed to my own server). I encoded a video at what I thought was a data rate within the limit for my stream, and it stuttered on playback over the stream. The FMS supplier informed me that the FLV I thought I'd encoded at 400 kbps video and 96 kbps audio actually had a data rate of around 700 kbps in total. But if I open the video in an FLV player and look at its info, it still shows the settings I originally intended. However, my supplier suggested checking the data rate using this equation:
    (file size in kilobytes x 8) / duration in seconds = data rate in kbit/s
    If you apply this calculation to my video it does indeed show what they said:
    Using Flash Media Encoder:
    Encoded at 400kbps video - 96 kbps audio
    Compressed filesize = 26,517 kbytes
    Duration = 240 seconds
    CALCULATION GIVES A DATA RATE OF ABOUT 880 kbit/s
    (note file contains an alpha channel)
    Furthermore, applied to a shorter video, we get the following
    more logical results:
    Encoded at 400kbps video - 96 kbps audio
    Compressed filesize = 6,076 kbytes
    Duration = 90 seconds
    CALCULATION GIVES A DATA RATE OF ABOUT 527 kbit/s
    (no alpha channel)
    The only differences between the two videos are the alpha channel and the duration. Something doesn't add up here.
    Any clues?
    Thanks
    Simon

    I've just partly answered my own question. Funny how writing a post can help you think your problem through sometimes.
    The issue is the alpha channel, which adds to the file size, thereby making the data rate "incorrect". In other words, the same file encoded without the alpha channel produces the expected results. I guess this changes my question to: how do I create predictable data rates when encoding with an alpha channel? I could just estimate, of course, and see what comes out, but is there a better way?
    Simon.

  • Missing: Bluetooth with Enhanced Data Rate for vista

    I was looking for the "Bluetooth with Enhanced Data Rate Software II" for Vista.
    After I found the right page (http://www-307.ibm.com/pc/support/site.wss/document.do?lndocid=MIGR-67250), it seems there is no link to the setup file.
    Any ideas?

    glev55's link to the x6x Vista drivers is also fixed now.
    Thanks to whoever reported it through Site Feedback!
    I don't work for Lenovo. I'm a crazy volunteer!
