Data Conversion and Derived Column issues

I have a strange issue occurring with one of my SSIS packages. 
What needs to happen:
1. Read data from a table that stores a field as NCHAR(40).
2. Send it through a Data Cleansing SSIS component that forcibly outputs the cleansed data as DT_STR(50).
3. Update the same source table with the cleansed data, using a third-party UPSERT tool.
Of course, I can't update a table that stores NCHAR(40) data with DT_STR(50) data, so I'm trying to use the Data Conversion component, the Derived Column component, or a combination of the two, to convert the data to DT_WSTR and to set the correct length, from 50 down to 40.
The Data Conversion Component fails when I try to set the incoming data (DT_STR(50)) to DT_WSTR(40):
[Data Conversion [186]] Error: Data conversion failed while converting column "MD_Address1" (97) to column "_MD_Address1" (190).  The conversion returned status value 2 and status text
"The value could not be converted because of a potential loss of data.".
I then tried the same thing with the Derived Column component, converting the data from the DQS component from DT_STR(50) to DT_WSTR(40), and here's the error message:
[Derived Column [196]] Error: The "Derived Column" failed because truncation occurred, and the truncation row disposition on "Derived Column.Inputs[Derived Column Input].Columns[_MD_Address1]"
specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
I also tried a combination of a Data Conversion (from DT_STR(50) to DT_WSTR(50)) and a Derived Column just casting to the correct size, but that didn't work either.
Every time I try to change the size, one of the components fails.
It appears that my issue has nothing to do with the data types, but with the actual data length.
Why can't SSIS CAST my input from 50 to 40?
What do I need to do to cast/convert data from DT_STR(50) to DT_WSTR(40)?

Hi IgorSantos_FL,
This is the expected behavior when we try to convert DT_STR(50) (meaning a maximum of 50 characters in the value) to DT_WSTR(40) (a maximum of 40 characters). However, the truncation issue should not occur if you convert DT_STR(50) to DT_WSTR(50). Could you post the error message that you received when converting DT_STR(50) to DT_WSTR(50)? It may be a different issue.
Regards,
Mike Yin
TechNet Community Support
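If the cleansed values are known to actually fit in 40 characters, one workaround (a sketch, not from the thread itself) is to make the truncation explicit so the component's default "fail on truncation" disposition never fires. In a Derived Column expression, take only the first 40 characters before casting; `_MD_Address1` is the column name from the error messages above:

```
(DT_WSTR,40)SUBSTRING(_MD_Address1,1,40)
```

Alternatively, the truncation row disposition on the conversion component can be changed from "Fail component" to "Ignore failure" via Configure Error Output, which silently truncates any value longer than 40 characters.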

Similar Messages

  • Best way to derive a "week ending" date using the Derived Column Transformations

    Hi, I have an interesting challenge. I am working on creating a BI DB that contains timesheet data. The data contains a column representing the date worked ([Date]). Nearly all output reporting is based on a timesheeting week that ends on
    a Wednesday. My thinking has been to create a derived column "WE" (week ending) that represents the entries of the preceding 6 days.
    (Originally I entertained deriving this value via a SQL view; however, we are talking about a substantial DB (in excess of 100M timesheet bookings) and an index on the WE field is warranted, so I decided a derived WE column was the best approach.)
    The Date field arrives in a SAP format (German long date format); however, I cannot use the convert option ;( in the TE.
    The Date field is derived via: (DT_DATE)(SUBSTRING([Date      ],7,4) + "-" + SUBSTRING([Date      ],4,2) + "-" + SUBSTRING([Date      ],1,2))
    I would welcome some recommendations on how best to derive a WE column. (The DT_DATE format is not susceptible to a /7, mod 7 operation.)
    Thanks in advance,
    /L

    Try this solution:
    http://stackoverflow.com/questions/1803987/how-do-i-exclude-weekend-days-in-a-sql-server-query
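    A sketch of one way to do this with a Derived Column expression, assuming SSIS's DATEPART("dw", ...) returns 1 for Sunday through 7 for Saturday (so Wednesday is 4), and using [WorkDate] as a stand-in name for the already-derived DT_DATE column:

    ```
    DATEADD("dd", (4 - DATEPART("dw", [WorkDate]) + 7) % 7, [WorkDate])
    ```

    This maps a Wednesday to itself and any other day forward to the next Wednesday, which matches a week ending on Wednesday that covers the preceding 6 days.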

  • Planning and Scoping- Data conversion and testing

    Hello gurus,
    Good people, I'm looking for your help in these areas:
    1) Planning and scoping for data conversion. We will be in the process of acquiring information from the client soon. What sort of information or questions do I need to get from the client, and what plan is needed?
    2) Test planning. What do I need to get from the client to prepare for this plan?
    thanks
    rasham

    Hi,
    The question which you have asked is really very broad, so let me give some idea regarding the planning of data conversion. First of all, you should identify which objects you want to be transferred to SAP; in MM, obviously, the material master, info records, source lists, etc. Then you need to decide which SAP tool you will use to upload these data, like LSMW, CATT, etc.
    Once it is decided what and how you are going to transfer, then comes the functional spec for data conversion, in which you will give the details of data mapping in SAP. This FS will be translated to a tech spec by a technical consultant.
    Test planning:
    First of all, you will decide what type of testing you want to carry out, like string test, unit test, and integration test. These are decided by the testing team, not the functional consultant, but once it is decided how many levels of testing are carried out, then as a functional consultant you will have to create test scenarios for integration testing.
    Regards
    Vikrama

  • Report Attributes: Add column link and derived column

    On the Report Attributes page (4000:420), under the Tasks list on the right side of the page, there are 2 links, "Add column link" and "Add derived column". This is the first time I am noticing them.
    The usage is clear: they add columns to the report that function either as a drill-thru link column or just a read-only derived field.
    Doing it this way does NOT modify the source query for the report region.
    Is there any benefit to using this feature versus adding "null column_link" or "null derived_col" columns to my source query and then setting attributes for those columns?
    Thanks

    Well, one disadvantage I can see of using 'Add Column link' is that if I have more than one such column, they all show up as '[column link]' on the Report Attributes page, with no way to tell what the underlying links are. Instead, if I were to manually add these dummy columns to my query itself, I would give them meaningful column aliases that indicate where they link to.
    Suggestion: Maybe a tooltip could be added to the Report Attributes page for such columns which indicate what the underlying link is without actually going to the Edit page for that column.
    Thanks

  • File Content Conversion and Empty file issue

    Hello,
    The issue is :
    I have configured a file sender adapter with file Content Conversion. I have 2 kinds of records Header and Items.
    I have defined the 'Key Field Name' based on the first character of each line of my flat file, and the values associated:
    Header.keyFieldValue = 1
    Item.keyFieldValue = 2
    Flat file example :
    1;Headerfield1;Headerfield2
    2;ItemField1;ItemField1;ItemField1;ItemField1;
    2;ItemField1;ItemField1;ItemField1;ItemField1;
    2;ItemField1;ItemField1;ItemField1;ItemField1;
    and everything works fine!
    But now imagine you receive a bad file like this:
    xxxxx;ohhohoh;llllll
    y;sdfsdfs;zezerz;zerzer
    e;zerzerze;zezerzerzer
    The result is: IS receives nothing and no alert is generated?!
    What I have seen is that the adapter doesn't find any corresponding value for keyFieldValue, so it considers the flat file empty and does nothing; the file is archived and that's all, no alert is generated.
    But I want to receive an alert indicating that the processed flat file was not correct!
    If anybody has an idea, it would be great!
    Regards,
    Vincent

    Hi Vincent.
    Sometimes it really #!@#%%#.. me off when customers expect XI to solve their whole organization's interfacing problems (and world hunger as well... :)...
    Even when it comes to the responsible systems that create the file (sometimes with bugs and problems even before XI came to the org.): as with every developed application or out-of-the-box one, the application has to take care of its outputs and deal with errors. When it can't deliver what other systems expect of it, it has to inform someone...
    When it comes to the Adapter Framework, XI expects the systems in its landscape to be responsible for the data they send (well formed and with the defined structure: IDocs, XML, flat files).
    As mentioned in my colleagues' previous postings, you can take care of data verification during mapping and so on.
    I believe it is possible to monitor the incoming file before it is parsed to XML (not sure it is the right way), and maybe we'll get a solution to that in the future (today we can monitor whether the comm. channel is configured well or not; maybe in the future it will be possible to alert when an incoming file is empty).
    Regards.
    Nimrod

  • Need help in solving conversion row to columns issue

    Hi friends
    I came across a strange situation/problem. The problem is in one table, say source_metadata.
    I have 4 attributes:
    matching_Table matching_columns source_table source_col
    OTREDW.PARTY_ALT_ID Src_key rcw Source_key
    OTREDW.PARTY_ALT_ID party_id rcw party_id_org
    otredw.individual SRC_KEY
    otredw.individual name rcw name_org
    otredw.individual name rcw name_mod
    otredw.wage_fact src_key
    otredw.wage_fact wages_tips rcw wage_tips_org
    otredw.wage_fact wages_tips rcw wage_tips_mod
    The matching tables and source table can be matched on src_key/source_key.
    I need to fetch the values from the respective tables/columns in one query. I have just a faint idea that the solution involves the PIVOT feature.
    The desired OUTPUT is
    PARTY_ID PARTY_ID_ORG Name name_org name_mod wages_tips wages_tips_org wages_tips_mod
    1111111 1111111 James James JamesR 1000 1000 2000
    I hope I have explained what I need. I would appreciate it if anyone could show me how to resolve this issue.
    Thanks
    Rajesh

    Thanks everyone for sending me answers, but I think I was unable to explain my issue.
    The main issue is that I am storing the name of a table and the name of the corresponding column of that table under different columns of a metadata table. How do I fetch the data using SQL from these columns?
    EXAMPLE
    METADATA_TABLE
    table_name Column_name
    OTREDW.PARTY_ALT_ID PARTY_ID
    OTREDW.INDIVIDUAL NAME
    OTREDW.WAGE_FACT WAGES_TIPS
    How do I write a query to fetch the table_name and column_name from METADATA_TABLE, and then fetch the data from the respective tables mentioned in those columns?
    I hope I was able to explain the question now.
    Thanks
    Rajesh
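    A hedged sketch of the metadata-driven approach, assuming SQL Server (Oracle would use PL/SQL with EXECUTE IMMEDIATE instead): build the query text from METADATA_TABLE with dynamic SQL, then execute it. The names come from the example above; the CAST to a common type is needed so the UNION ALL branches agree:

    ```sql
    DECLARE @sql NVARCHAR(MAX) = N'';

    -- Concatenate one SELECT per metadata row, each prefixed with ' UNION ALL '.
    SELECT @sql = @sql
        + N' UNION ALL SELECT ''' + table_name + N''' AS table_name, '
        + N'CAST(' + column_name + N' AS VARCHAR(100)) AS value FROM ' + table_name
    FROM METADATA_TABLE;

    -- Strip the leading ' UNION ALL ' (11 characters) and run the built query.
    SET @sql = STUFF(@sql, 1, 11, N'');
    EXEC sp_executesql @sql;
    ```

    Pivoting the result into one row per src_key would then be a second step (PIVOT or conditional aggregation).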

  • FYI: date.timezone and mysql.sock issues resolved!

    For anyone who might run into the same issues with your website(s) like I did. Here are a couple things you may want to look into if you are having these issues.
    The cause of these issues is due to snow leopard server using PHP 5.3.
    ISSUE #1: PHP throwing the following errors:
    Warning: strtotime() [function.strtotime]: It is not safe to rely on the system's timezone settings. You are required to use the date.timezone setting or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected 'Europe/Helsinki' for 'EEST/3.0/DST' instead in /path/to/my/www/wp-includes/functions.php on line 35
    Warning: date() [function.date]: It is not safe to rely on the system's timezone settings. You are required to use the date.timezone setting or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected 'Europe/Helsinki' for 'EEST/3.0/DST' instead in /path/to/my/www/wp-includes/functions.php on line 43
    FIX FOR ISSUE #1: Set date.timezone = "YOURTIMEZONE" in your /etc/php.ini. Here is the list of timezones: http://nl3.php.net/manual/en/timezones.america.php. Then restart the web service in Server Admin.
    ISSUE #2: PHP throwing the error "Warning: mysql_connect() [function.mysql-connect]: [2002] No such file or directory (trying to connect via unix:///tmp/mysql.sock)"
    FIX FOR ISSUE #2: Set the mysql.default_socket value in your /etc/php.ini to mysql.default_socket = /var/mysql/mysql.sock. Then restart the web service in Server Admin.

    Didn't resolve it for me using 10.6.1 server. I tried adding each of the following in php.ini (not all at the same time, one by one...). Restarted web service after each change (even though you don't really need to).
    [Date]
    ; Defines the default timezone used by the date functions
    ; http://php.net/date.timezone
    date.timezone = 'America/Los_Angeles'
    date.timezone = 'PDT'
    date_default_timezone_set('America/Los_Angeles')
    date_default_timezone_set("America/Los_Angeles")
    Same error in the logs with any of these enabled:
    [Sat Sep 19 14:33:32 2009] [error] [client xxx.xxx.xxx.xx] PHP Warning: date(): It is not safe to rely on the system's timezone settings. You are required to use the date.timezone setting or the date_default_timezone_set() function. In case you used any of those methods and you are still getting this warning, you most likely misspelled the timezone identifier. We selected 'America/Los_Angeles' for 'PDT/-7.0/DST' instead in /usr/share/squirrelmail/functions/date.php on line 289, referer: https://icecrown.hri.uci.edu/webmail/src/webmail.php
    I'm not running any PHP other than webmail anyway and it seems to be working fine (Besides not being able to get vacation replies working) but I haven't done a lot of sending / receiving since this is a test box.
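    For reference, a sketch of the working form of the fixes from the original post as a php.ini fragment. Note that the function-call lines tried above are not valid ini syntax (date_default_timezone_set() is a PHP function for use in script code, while the ini file takes only key = value directives), and abbreviations like 'PDT' are not valid timezone identifiers:

    ```ini
    ; /etc/php.ini
    date.timezone = "America/Los_Angeles"
    mysql.default_socket = /var/mysql/mysql.sock
    ```

    After editing, restart the web service so PHP rereads the file.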

  • ACS with patch L 6 and Name column issue

    Has anyone experienced the following?
    My customer has used the migration tool to migrate users from ACS 4.2 to 5.3. He has also applied patch level 6. However, under Identity Groups, part of the listed names (the Name column), from some characters up to half of the name, is missing; e.g., if the name contains "Dimension Data", after migrating only "Dimensi" is seen. He then removed Patch Level 6 and reapplied it, with no success. Any advice, or do I need to run to the TAC? :)
    Thanks a lot
    Lancellot Wendel

    Hi Tarik,
    Thanks for the reply.
    With regard to the question
    "If you remove patch 6 and then migrate, does it work?"
    No, it did not work either. Well, I guess I have to open a TAC case for this.
    Thanks in advance.
    regards,
    lancellot

  • Select Data Source and Microsoft Security Issue

    Hi,
    Tool- Xcelsius 2008, QAaWS
    When I open the dashboard, it gives the message "Microsoft Office has identified a potential security concern" / "Data Connections have been blocked. If you choose to enable data connections, your computer may no longer be secure. Do not enable this content unless you trust the source of this file." with <Enable> and <Disable> buttons.
    If Enabled, it leads to a "Select Data Source" screen and asks for DSN details.
    It shows the same messages at every open.
    Please, help if anyone knows or faced this issue.
    Regards,
    Ashish

    Hi,
    This is a really old post.
    Could you please specify your exact workflow?
    What connectors is your dashboard using?
    Also, what version, SP, and patch are you using for the Xcelsius client?
    I.e., are you up to date with the latest compatibility updates?
    regards,
    H

  • SQL Loader: handling difference datatypes in data file and table column

    Hi,
    I am not sure if my question is valid but I am having this doubt.
    I am trying to load data from my data file into a table with just a single column of FLOAT datatype using SQL*Loader. But very few insertions take place, leaving a large number of records rejected for the same reason:
    Record 7: Rejected - Error on table T1, column MSISDN.
    ORA-01722: invalid number
    The data in my datafile goes like this: (with a single space before every field)
       233207332711<EOFD><EORD>    233208660745<EOFD><EORD>    233200767380<EOFD><EORD>
    Here I want to know if there is any way to type cast the data read from the data file to suit my table column's datatype.
    How do I handle this? I want to load all the data from my datafile into my table.

    Pl continue the discussion in your original post - Pls help: SQL Loader loads only one record
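    A hypothetical control-file sketch (the file name and the <EOFD>/<EORD> handling are assumptions based on the sample above): SQL*Loader can apply a SQL expression to each field, so the leading spaces can be stripped before Oracle tries to interpret the value as a number:

    ```
    LOAD DATA
    INFILE 'msisdn.dat' "STR '<EORD>'"
    INTO TABLE T1
    FIELDS TERMINATED BY '<EOFD>'
    (
      MSISDN "TO_NUMBER(TRIM(:MSISDN))"
    )
    ```

    The general point is the quoted SQL expression after the column name: it runs for every row, so TRIM/TO_NUMBER (or TO_DATE, SUBSTR, etc.) can cast data-file text to the table column's datatype.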

  • Since Mountain Lion I find in Numbers 3-4 columns of data missing and some columns in the wrong order. Fortunately I have a backup on my iPad and was able to put that version in as a new file. Why has this happened?

    Since Mountain Lion I have noticed 3-4 columns of data missing from a Numbers file. I have fortunately been able to replace the whole file from my iPad version, but why has this happened? It makes me lose trust in Numbers.

    That would be strange.

  • Aironet 1200 LWAPP Conversion and WLC Authorization Issues

    Please excuse the length of this.
    I am trying to convert several Aironet 1200 series APs to the LWAPP Recovery image and register them with a WLC. There is no WCS involved.
    All of the 1200s have b/g cards in them, specifically MP21G and MP31G, so per the "Upgrading Autonomous Cisco Aironet Access Points to Lightweight Mode" guide, I am good. The 1200s are older and obviously do not have the manufacturer certificate.
    The WLC is running the latest code, AIR-WLC4400-K9-3-2-116-21.aes.
    The APs are running 12.3(7)JA2 or 12.3(7)JA3, all of which per the upgrade document and the Upgrade Tool utility meet the minimum requirements being 12.(7)JA or greater.
    I will first dive into the disaster of the conversion process.
    First, I upgraded a 1200 with an MP21G radio module. As far as the Upgrade Tool, I specified everything mandatory (AP credential file, recovery image, time source) while leaving the WLC and DNS server information absent. It successfully loaded the image and created all the necessary keys. Soon after I upgraded the image, the AP started rebooting continuously. I downgraded the AP to IOS, and found that the AP no longer has visibility of its radio module. It is simply gone. I upgraded and downgraded the code on the AP, hoping in the process that it would rediscover the radio module, but it never did. I have not taken it apart yet to see if there is anything physically wrong (and I will), but it was working fine previously, so even though it is possible that the radio module has checked out, my mind just can't go there. Scratch one.
    So, AP number two has the same hardware. After attempting to convert it seven times with complete failure, on the eighth it successfully converted. I changed nothing in the upgrade process. Cool. So, with this success, I wanted to add it to the WLC.
    Network overview: the AP and WLC (Management and AP-Manager) are in the same broadcast domain.
    Ok, the WLC configuration. The WLC is in Layer 3 mode which per the documentation is the only supported configuration that the converted Aironet APs will support. The Management and AP-Manager interfaces have been defined and are up and pingable by the converted AP. The AP information (AP MAC address, SSC, SHA key hash) has been entered in the WLC AP Authorization list as well. SSC certificates have been allowed.
    As far as DHCP options 43 and 60, none have been defined. Per the documentation, even if the WLC controller is in Layer 3 mode, as long as the AP is in the same broadcast domain, the AP should do a LWAPP discovery and make a join request to the controller. Again, the AP can ping both the Management and AP-Manager interfaces.
    So far, the AP has not been able to join the WLC. There is nothing in the logs. I have failed to do a debug on the WLC to see what is going on, but one thing I have seen is on the AP debug, a “AP-Manager not enough interfaces” message.
    All the documentation I have been referring to is the "Upgrading Autonomous Cisco Aironet Access Points to Lightweight Mode" guide.
    I will put the AP on a different segment tomorrow and try the DHCP route.
    It is probably something simple.
    Any ideas? :)

    Jason
    From your post I believe you've done everything the right way... except: Have you activated Master Controller Mode? If you haven't, no AP is ever gonna be able to register with a controller. You need to activate it temporarily when you want to add new (converted only?) radios. Once the AP is visible on the master controller (in the Wireless tab), define the primary/secondary/tertiary controller for it. When you're finished registering any new radios, disable it again.
    You can find it under Controller -> Master Controller Mode.
    And yes... always keep the AP you want to convert in the same subnet as the management interface (so you don't have to use DHCP and fumble around with the option 43 setting to find the controller). To help the Upgrade Tool work trouble-free: enable Telnet mode on the controller. Also, don't forget to enter the mgmt IP / user name / password in the Upgrade Tool. At least with this I was able to complete my upgrading and registering process successfully in my own environment.
    Hope this helps...
    Toni

  • Excel data source and the use of add command

    Hi, looking for suggestions on how to work with multiple inputs that cannot be joined directly. Here's the background.
    The report currently reads in two different Excel files and uses 3 SQL commands to query an Oracle database. I need to join the 5 data sources and am having issues with the 2 Excel files. In one file I need to be able to derive a field based on another column in the file, in order to create the join condition to the SQL commands. I'd equate this to a CASE statement in SQL, but how does one do that using the 'Add Command' feature? What is the syntax?
    Next I would need to join (left outer) the two Excel files using two fields from file A (a1, a2) and 3 fields from file B (b1, b2, b3), where a1 = b1 and b2 <= a2 <= b3 when rows from A exist in B. If a row from A does not exist in B, then we still want it in the report and available to left outer join to the 3 Oracle data sources.
    Runtime is also a concern.
    Any Suggestions?

    Hi Elena,
    In this case the use of subreports is not recommended. That's because you're exporting to Excel and you need data in columns across the report. Subreports will not, unfortunately, give you what you need.
    This brings you back to joining the data sources. What I would recommend is looking into using an Oracle database link to link your Oracle DB to the Excel files. There are articles on this, but if you have questions on it please ask them on an Oracle forum, as the syntax that you need will be database specific.
    A lot of databases have this type of technology, which allows you to create a view to other data. SQL Server has 'linked servers'; SAP HANA has 'smart data access'. Essentially you are creating a non-materialized view to the external data. This view is then available on the main Oracle server where you established the connection. This should be a lot easier than trying to bring a bunch of command objects together off independent data sources inside of Crystal.
    -jamie
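    On the first part of the question, deriving a field inside an 'Add Command': an Add Command is free-form SQL sent to the data source, so a derived column can be written directly, subject to what the Excel ODBC/OLE DB driver's dialect supports (the legacy Jet/ACE dialect uses IIF(condition, a, b) rather than CASE). A hypothetical sketch, with sheet and column names invented:

    ```sql
    SELECT a1,
           a2,
           CASE WHEN a2 >= 100 THEN 'HIGH' ELSE 'LOW' END AS join_key
    FROM   [SheetA$]
    ```

    The derived join_key column can then be used in the links between this command and the other data sources.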

  • The data type of "output column "Cookies" (7371)" does not match the data type "System.Int32" of the source column "Cookies"

    I have the following data flow:
    ADO Source
    Input Column: Cookies       ID: 7371      Datatype: DT_NUMERIC
    Output Column: Cookies     ID: 7391      Datatype: DT_NUMERIC
    DATA CONVERSION TASK
    Input Column: Cookies       ID: 1311     Datatype: DT_NUMERIC
    Output Column: Copy of Cookies   ID: 1444    Datatype: DT_I4
    SQL Server Destination
    Input Column: Copy of Cookies    ID: 8733    Datatype: DT_I4
    Output Column: Cookies       ID: 8323     Datatype: DT_I4
    This looks fine to me. Why am I getting the error?
    This is SQL Server 2008 and I am working at this point only in BIDS, so it is not a question of dev vs. prod server or anything. One environment, running in BIDS. Other similar data flows seem to be working OK. It seems the error is referring to a datatype mismatch in the source, but how can that be?
    Thanks!

    Actually, I am wrong in that, Visakh. I think you are correct.
    There are two versions of all tables, one with 15-minute rollups and one with daily rollups. Otherwise the tables are the same: same exact fields. I use a loop with a data flow inside to get data from one version of the rollup and then the other.
    The problem is, for some of the fields with larger values the datatype is NUMERIC instead of INTEGER. (This is a Caché database.) So:
    dailyCookies:   Field: CountOne   Datatype: NUMERIC
    15minCookies:   Field: CountOne   Datatype: INTEGER
    A variable dynamically creates the query, appending "daily" or "15min" to the beginning of the table names. So on the first loop it does the 15min tables and on the second the daily tables. When I created this particular data flow I had to plug a default query in, so I used the daily one. It picked up the datatype as NUMERIC, so when I run the package it loops through 15min first and sees the datatype is INTEGER.
    How do I deal with this? I suppose I could convert the datatype in the source query, but it would be a hassle to do that for some fields and not others. This table has hundreds of fields, BTW. Can one source object deal with a change of datatypes? SSIS is so picky about datatypes compared to other tools...
    Thanks,
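    One sketch of the workaround mentioned above (convert in the source query, so both loop iterations present the same metadata to SSIS); the table and field names are from the post, and the target type is an assumption:

    ```sql
    SELECT CAST(CountOne AS NUMERIC(18,0)) AS CountOne
    FROM   "15minCookies"
    ```

    If the dynamically built query applies the same CAST for both the "15min" and "daily" prefixes, the source component's external metadata stays stable across the loop.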

  • Wrongly Activated Extended Withholding tax before data conversion

    Hi all,
    In 1099 reporting for the US, I activated Extended Withholding Tax before converting data from Classical to Extended Withholding Tax. I did this in the Development system with a transport request.
    If anybody has come across a similar situation, please help me with your valuable inputs.
    1. Will I be able to deactivate Extended Withholding Tax, do the data conversion, and activate it again?
    2. If not, what will be the consequences of not doing the data conversion of the existing data?
    Kindly help me to come out of this crisis.
    Thanks in advance.
    San.

    Hi,
    Yes, it will be possible.
    I haven't faced this exact situation in EWT, but I have in another similar migration.
    I do not know the extent of customisation you did for EWT, or whether the documents you posted after activating got posted in the EWT area.
    The only place I think you may face problems is in the migration itself, where these documents may give you errors in the actual data migration step. But there should be some solution for that.
    You need to see what options you have. Can you let this situation continue? Was this migration a prep for doing the migration in production, or just a test? If not, then you will need to do something. This should have no impact on any migration you plan for the other systems (Quality/Production) in the landscape.
    You can log an OSS message just in case.
    Cheers...
