CS3 PNG reading/writing failing on very large network volumes

We are experiencing an issue with the built-in PNG file format plugin that is consistent across multiple versions of OS X on fully-patched CS3 Design Premium. I'm hoping to hear that others have seen something similar.
We recently deployed a 1.8TB network drive via Windows Server 2003. Since then we cannot read or write PNG files, not even a preview through the file dialog, without hanging the CS3 application indefinitely. We have done our testing in Photoshop and Illustrator; the same occurs when opening a file directly from the Finder into these apps. We are connecting to the server via SMB. All other file operations function properly with every other format, and working with PNGs outside of the Creative Suite works fine.
It is definitely isolated to PNG. We have tested exactly the same file on a sub-terabyte volume on exactly the same file server, and the PNG files opened and saved as expected. It is only on the large terabyte-plus volume that these issues appear.
There is no recovering from the error. The application beachballs and never returns to activity, no matter how long one waits.
We have tried an alternative PNG plugin to rule out some kind of special cross-platform metadata requirement that the built-in plugin handles under the hood for this file format. We tested SuperPNG as a replacement, and it functions as one would expect, opening and saving files properly and without issue.
Has anyone else seen this? We don't have drives this large attached locally to any of our machines, so we cannot test this outside of the network environment.
We are running fully-patched Creative Suite 3 Design Premium on G5- and Intel-based Macs running a mix of 10.5.x and 10.4.x, all regularly updated.
I look forward to hearing anyone else's experiences with this setup.
Thanks!
Greg

Buko--
Thanks for the response. However, I would hope that fixing incompatibilities like this is one way in which products grow over time.
I think it is fair to expect that Adobe can make its file formats work reliably on server volumes. No computer is an island, especially today.
I have found a workable solution, and I am not here to get a bug fixed or to complain to the world. I'm here in a discussion group to spark a discussion; I certainly understand that this is not the forum from which to expect technical support.
I remain interested in others' experiences.
Respectfully,
Greg

Similar Messages

  • When using Reader, the font gets very large on the last page and I can't change it

    You cannot edit PDF files with Reader. You can change the View options to make what you see smaller. View -> Zoom

  • Need help optimizing the writing of a very large array and streaming it to a file

    Hi,
    I have a very large array that I need to create and later write to a TDMS file. The array has 45 million entries, or 4.5x10^7 data points, in double format. The array is created using a square pulse waveform generator and user-defined specifications of the delay, wait time, voltages, etc.
    I'm not sure how to optimize the code so it doesn't take forever. It currently takes at least 40 minutes (and it's still running) to create and write this array. I know there must be a better way; the array is large and consumes a lot of memory, but it's not absurdly large. The computer I'm running this on has Windows Vista 32-bit, 4 GB of RAM and an Intel Core 2 CPU at 1.8 GHz.
    I've read the "Managing Large Data Sets in LabVIEW" article (http://zone.ni.com/devzone/cda/tut/p/id/3625), but I'm unsure how to apply the principles here. I believe the problem lies in making too many copies of the array: creating and writing 1x10^6 values takes < 10 seconds, but writing 4x10^6 values, which should theoretically take < 40 seconds, takes minutes.
    Is there a way to work with a reference to an array instead of a copy of an array?
    Attached is my current VI, Generate_Square_Pulse_With_TDMS_Stream.vi, and its two dependencies, although I doubt they are bottlenecking the program.
    Any advice will be very much appreciated. 
    Thanks
    Attachments:
    Generate_Square_Pulse_With_TDMS_Stream.vi ‏13 KB
    Square_Pulse.vi ‏13 KB
    Write_TDMS_File.vi ‏27 KB

    Thanks Ravens Fan; using Replace Array Subset and initializing the array beforehand sped up the process immensely. I can now generate an array of 45,000,000 doubles in about one second.
    However, when I try to write all of that out to TDMS at the end, LV runs out of memory and crashes. Is it possible to write out the data in blocks and make sure memory is freed up before writing out the next block? I can use a simple loop to write out the blocks, but I'm unsure how to verify that memory has been cleared before proceeding. Furthermore, is there a way to ensure that memory and all resources are freed up at the end of the waveform generation VI?
    Attached is my new VI and a refined TDMS write VI (I just disabled the file viewer at the end). Sorry that it's a tad messy at the moment, but most of that mess comes from doing some arithmetic to determine which indices to use with Replace Array Subset. I currently have the TDMS write disabled.
    Just to clarify the above: I understand how to write out the data in blocks; my questions are how to ensure that memory is freed up between subsequent writes, and how to ensure that memory is freed up after execution of the VI.
    @Jeff: I'm generating the waveform here, not reading it. I guess I'm not generating a "waveform" but rather a set of doubles; converting that into an actual waveform can come later.
    Thanks for the replies!
    Attachments:
    Generate_Square_Pulse_With_TDMS_Stream.vi ‏14 KB
    Write_TDMS_File.vi ‏27 KB
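    The block-write idea above can also be sketched outside LabVIEW. Below is a rough illustration in Java (the block size, output file name, and stand-in pulse generator are assumptions, not taken from the attached VIs): generate and stream the 45,000,000 points one reusable block at a time, so only a single block ever sits in memory.
        import java.io.BufferedOutputStream;
        import java.io.DataOutputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;

        public class BlockWriteSketch {
            public static void main(String[] args) throws IOException {
                final long totalPoints = 45_000_000L;   // full data set size
                final int blockSize = 1_000_000;        // points generated/written per block (assumed)
                double[] block = new double[blockSize]; // the only large buffer ever allocated

                try (DataOutputStream out = new DataOutputStream(
                        new BufferedOutputStream(new FileOutputStream("pulse.bin")))) {
                    long written = 0;
                    while (written < totalPoints) {
                        int n = (int) Math.min(blockSize, totalPoints - written);
                        for (int i = 0; i < n; i++) {
                            // stand-in for the square-pulse generator: alternate 0 V / 5 V every 1000 points
                            block[i] = (((written + i) / 1000) % 2 == 0) ? 0.0 : 5.0;
                        }
                        for (int i = 0; i < n; i++) {
                            out.writeDouble(block[i]); // stream this block; the buffer is reused next pass
                        }
                        written += n;
                    }
                }
            }
        }
    This writes a plain binary file rather than TDMS, but the point carries over: memory use is bounded by one block, not by the whole data set.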

  • Leopard does not support reading/writing to SD card via network?

    I have a Pixma MX700. I can read and write to SD cards using the card reader built into this machine ... IF I use Windows XP. Leopard will mount the card and allow me to view its contents, but the minute I try to import a file to my iMac, I get error code 1401.
    I've been told by Canon support that the new MP Navigator EX driver for Leopard disables reading from the memory card over a network. They claim Leopard does not support this function and reading the posts from others having the same issue using different multi-function printers, this would appear to be a Leopard issue, not a Canon issue.
    Can anyone confirm this and/or offer a fix?
    Thanks!

  • Problems on Large Network, Limited Administrative Pull

    Alright, well I am full of problems here. First let me share the problem. I am on a very large network: over 100 sites and tens of thousands of computers, 99% of which are Windows machines. The respect and support for Macs in this school district is nonexistent, so nobody is going to do us any favours like opening ports. I am merely a tech support student at the school, the only one willing to help with the little Mac lab we have. We have over 600 computers at the school, only 21 of them Macs.
    The department that owns the Macs, the Yearbook Department, purchased Apple Remote Desktop 2 last year, a week before 3 came out. Since it was bought, there has been nothing but trouble. I know it is version 2; I am not sure if any updates are installed. I am hesitant to update, because I would have to go to every machine and upgrade the software.
    Now to the problem. I have looked around and seen really nothing on the specific issue I am having. When I open Remote Desktop, I see maybe a few machines connected, and it varies which ones they are. All the others either say Offline or something like "Remote Desktop not Enabled", which I know for a fact is enabled. What can I do about this? I am just a student with the admin password for the Macs; that is all I have and all I can get. If I ask for any favour to make it work, I will more than likely be laughed at and made fun of, even by older so-called "pros." It is very immature, it seems.
    So the rundown of some "fixes" I have seen: opening ports, not an option; static IPs, not an option; forwarding ports, not an option. The only things I have access to are the Macs and the admin Mac. Also, I have had problems when starting up Remote Desktop and getting an "Unexpectedly Quit" dialogue. And one more question, related to licensing: if we own the unlimited license, can the Remote Desktop Admin software be installed on more than one admin machine? The school is planning to get a few more Macs for video production. Currently the Macs we have run Panther and Tiger, and should have the newest updates of each.
    Thanks for any help. I really need the answers quickly; school is about to get out and I am a senior, so I probably won't be back. And I really feel bad for the Yearbook teacher, because I am the first person who has really helped her with any problems rather than poking fun at her or completely ignoring her.
     PowerBook G4 15"   Mac OS X (10.4.7)   1.67 Ghz, 1G RAM, 128MB VRAM, 80G HD, DL SuperDrive

    > Well, the DHCP addresses are on a 3-day lease, so it shouldn't reset unless we have a long weekend. If the IP on the computer does change, how does the ARD Admin find the computers again? That is what seems to be the problem: after IPs change, the Admin computer can no longer find it.
    One way is to rescan the network looking for the machines. This is not a desirable way to do it, but it works...most of the time.
    The better way is to have the machines check in on a daily basis and update their information. In ARD 2, select the machine(s), then click on Manage -> Set Reporting Policy. By default it is set for midnight; change this to some time during the day when you know the admin and client machines will be on. Click on Set.
    An important note is that the admin machine should always keep the same IP so the client machines can find it. If the admin machine's IP changes frequently, rescanning may be the only option.

  • When writing an email the letters are so small that I can hardly read them. If I increase the size then the outgoing email letters are very large.

    I have tried changing the font, but it doesn't help. This is very annoying. How can I remedy this?

    I use these settings, so try them:
    Tools > Options > Display > Formatting tab
    or
    Menu icon > Options > Options > Display > Formatting tab
    * eg: Default font: Arial
    * size 14
    Note: this sets the default font to be used in both display of messages and the font to use as default in composing messages.
    Under Plain text Messages
    * select: Style: Regular, Size: Regular, and colour: Black
    * Click on the 'Advanced' button
    * make sure all sizes are 14
    * select: 'Allow messages to use other fonts'
    * select: 'Use fixed width fonts for plain text messages'
    * click on OK
    Note: this will use the default (eg Arial), but allows the display of messages to use other fonts if the sender specified them.
    then at top click on 'Composition' > General tab
    note: This sets the default composing text colour and background colour.
    under HTML
    * Font: Variable width
    * Size : Medium
    * Select a text colour for writing emails eg: Blue
    * Select a background colour eg: white
    click on OK to save and close Options.

  • Have a very large text file, and need to read lines in the middle.

    I have very large txt files (around several hundred megabytes), and I want to be able to skip ahead and read specific lines. More specifically, say the file looks like:
    scan 1
    scan 2
    scan 3
    ...
    scan 100,000
    I want to be able to move the file reader immediately to scan 50,000, rather than having to read through scans 1-49,999.
    Thanks for any help.

    If the lines are all different lengths (as in your example) then there is nothing you can do except to read and ignore the lines you want to skip over.
    If you are going to be doing this repeatedly, you should consider reformatting those text files into something that supports random access.
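    If reformatting the files isn't an option, a middle ground is to scan the file once and remember the byte offset where each line starts; after that, any line can be reached with a direct seek. A minimal sketch in Java (the file name is hypothetical, and it assumes the file really has at least 50,000 lines):
        import java.io.IOException;
        import java.io.RandomAccessFile;
        import java.util.ArrayList;
        import java.util.List;

        public class LineIndexSketch {
            public static void main(String[] args) throws IOException {
                List<Long> lineOffsets = new ArrayList<>();
                try (RandomAccessFile raf = new RandomAccessFile("scans.txt", "r")) {
                    // One-time pass: remember the byte offset where each line starts.
                    lineOffsets.add(0L);
                    while (raf.readLine() != null) {
                        lineOffsets.add(raf.getFilePointer());
                    }
                    // Later: jump straight to line 50,000 (index 49,999) without rereading lines 1-49,999.
                    raf.seek(lineOffsets.get(49_999));
                    System.out.println(raf.readLine());
                }
            }
        }
    The index costs one full read up front, but every subsequent lookup is a single seek.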

  • How do I control read start position in a very large file where start byte position may be larger than I32 (+/- 2^31)?

    Using LabView, I am trying to read a very large file which may be on the order of 2^32 bytes. I need to be able to step into the file at a byte position which may be greater than the I32 limit set by the read file.vi. Are there any options to the read file.vi or a method of circumventing this limitation?

    I'm not sure, but I think you can manage the "pos mode" input of the "seek" sub-VI.
    The "pos mode" lets you choose the reference position to which the number of bytes you want to move is added.
    I think you can first add an I32 offset with "pos mode" set to "start", and then use "pos mode" set to "current" to add another value. That way you can end up more than 2^31 bytes from the initial position.
    I hope you understand my idea; I haven't tried it, but I think it would work.
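    To make that idea concrete: the trick is chaining several relative moves, each of which fits in a signed 32-bit offset, until you land on a position beyond 2^31. A rough Java sketch, where seekRelative32 is a hypothetical stand-in for a seek that only accepts an I32 offset (Java's own seek already takes a 64-bit value, so this is purely illustrative):
        import java.io.IOException;
        import java.io.RandomAccessFile;

        public class BigSeekSketch {
            // Hypothetical stand-in for an API that can only move by a signed 32-bit amount at a time,
            // like the I32 offset input discussed above.
            static void seekRelative32(RandomAccessFile f, int delta) throws IOException {
                f.seek(f.getFilePointer() + delta);
            }

            // Reach any 64-bit position by chaining relative moves that each fit in an I32.
            static void seekTo(RandomAccessFile f, long target) throws IOException {
                f.seek(0); // start from the beginning ("pos mode" = start)
                long remaining = target;
                while (remaining > 0) {
                    int step = (int) Math.min(remaining, Integer.MAX_VALUE); // each step fits in I32
                    seekRelative32(f, step); // "pos mode" = current, offset = step
                    remaining -= step;
                }
            }

            public static void main(String[] args) throws IOException {
                try (RandomAccessFile f = new RandomAccessFile("huge.dat", "r")) {
                    seekTo(f, 3_000_000_000L); // > 2^31, reached here in two relative steps
                    System.out.println("Now at byte " + f.getFilePointer());
                }
            }
        }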

  • Read very very large Excel file into 2D array (strings or variant)

    Hi all,
    Long time user, first time poster on these boards.
    Looking at the copious amounts of great info related to reading Excel data from .xls files into LabVIEW, I've found that every example from various people uses the ActiveX method (WorkSheet.Range), to which two strings are passed, namely Excel's LetterNumber format to specify start and end.
    However, this function does not work when trying to query huge amounts of information. The error returned is "Type Mismatch, -2147352571". I have a very large Excel sheet I need to read data from and then close the Excel file (the original file remains unchanged). However, this file is gigantic (don't ask me, I didn't make it and I can't convince them to use something more appropriate), with over 165 columns at 1000 rows of data. I can read a large number of columns but only a handful of rows, or vice versa.
    Aside from creating a loop to open and close the Excel file over and over, reading pieces of it at a time, is there a better way to read more data using ActiveX? Attached is code uploaded by others (with very minor modification) as an example.
    Thanks,
    Attachments:
    Excel Get Data Specified Field (1-46col).vi ‏23 KB

    Hi Maddox731,
    I've only had a very quick glance through your thread, and I must admit I haven't really thought it through properly yet. It sounds like you've come up with your own solution anyway. That said, I thought I'd take a bit of a scatter-gun approach and attach some stuff for you regardless. Please forgive my bluntness.
    You'll find my ActiveX Excel worksheet reader, which may or may not contain the problem you've come across. I've never tried it with the data size you are dealing with, so no promises. I've also attached my ADO/SQL approach to the problem. This was something I moved on to when I realised the limitations of ActiveX. One thing I have noticed is that ADO/SQL is much faster than ActiveX, so there may be some gains for you there with large data sets if you can implement it.
    I should add that I'm a novice at all this and my efforts are down to bits I've gleaned from MSDN and others' LV examples. I hope it's of some use, if only to spark discussion. Your criticism is more than welcome, good or bad.
    Regards.
    Attachments:
    Database Table Reading Stuff.zip ‏119 KB
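    For what it's worth, outside LabVIEW the same "whole sheet into a 2-D string array" job is often done with a library rather than ActiveX ranges. Here is a sketch using the Apache POI library in Java, purely as a point of comparison and not what either poster used; the file name and sheet index are assumptions:
        import java.io.File;
        import java.util.ArrayList;
        import java.util.List;

        import org.apache.poi.ss.usermodel.Cell;
        import org.apache.poi.ss.usermodel.DataFormatter;
        import org.apache.poi.ss.usermodel.Row;
        import org.apache.poi.ss.usermodel.Sheet;
        import org.apache.poi.ss.usermodel.Workbook;
        import org.apache.poi.ss.usermodel.WorkbookFactory;

        public class ExcelTo2DArraySketch {
            public static void main(String[] args) throws Exception {
                DataFormatter fmt = new DataFormatter(); // renders any cell type as its displayed text
                List<List<String>> table = new ArrayList<>();
                try (Workbook wb = WorkbookFactory.create(new File("big_sheet.xls"))) {
                    Sheet sheet = wb.getSheetAt(0);
                    for (int r = 0; r <= sheet.getLastRowNum(); r++) {
                        Row row = sheet.getRow(r);
                        List<String> cells = new ArrayList<>();
                        if (row != null) {
                            for (int c = 0; c < row.getLastCellNum(); c++) {
                                Cell cell = row.getCell(c);
                                cells.add(cell == null ? "" : fmt.formatCellValue(cell));
                            }
                        }
                        table.add(cells); // 1000 rows x 165 columns fits comfortably in memory
                    }
                }
                System.out.println("Read " + table.size() + " rows");
            }
        }
    There is no per-range string limit here, since the whole sheet is walked row by row.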

  • Handling very large diagrams in Pages?

    I am writing a book that requires sometimes the use of large diagrams. These are vector-based diagrams (PDF). Originally, I planned to use iBooks Author and widgets to let the user zoom/pan/scroll and use other nice interactive stuff, but after having tried everything I have decided to give up on iBooks Author and iBooks for now because of its dismal handling of images (pixels only, low resolution, limited size only, etc.).
    I am planning to move my project over to Pages. Not having the 'interactive widget' approach means I need some way to handle large images. I have been thinking about placing very large images multiple times on different pages with different masks. Any other possible trick? Can I have documents with multiple page sizes? Do I need a trick like the one above, or can an ePub book be zoomed/panned/scrolled, maybe using something other than iBooks to read it?

    Peter, that was indeed what I expected. But it turns out iBooks Author can take PDF; iBooks cannot, and iBooks Author renders PDFs to low-resolution images (probably PNG) when compiling the .ibook from the .iba.
    Even if you use PNG in the first place, the export function of iBooks Author (either to PDF or to iBook) creates low-resolution renders.
    The iBooks format is more of a web-based format. The problem lies not in what iBooks Author can handle, but in how it compiles it to the iBooks format. It uses the same export function for PDF, making PDF export ugly and low-res as well.
    iBooks Author has more drawbacks. For instance, if you have a picture and want to change the image inside it, you can't; you have to replace the entire picture, and that process breaks all the links to the picture.
    iBooks Author / iBooks is far from mature.

  • Today, I randomly happened to have less than 1GB of hard drive space left. I found very large "frame" files, what are they?

    I found very large "frame" files. What are they, and can I delete them? (See screenshot.) I'm a (17 today)-year-old film-maker and can't edit in FCP X anymore because I "don't have enough space". Every time I try to delete one, another identical file creates itself...
    If that helps: I just upgraded to FCP X 10.0.4, and every time I launch it, it asks to convert my current projects (I know it would do that at least once). I accept, but every time I have to get it done AGAIN. My computer is slower than ever, and I have a deadline this Friday.
    I also just upgraded to Mac OS X 10.7.4, and the problem hasn't been here for long, so it may be linked...
    Please help me!
    Alex

    The first thing you should do is to back up your personal data. It is possible that your hard drive is failing. If you are using Time Machine, that part is already done.
    Then, I think it would be easiest to reformat the drive and restore. If you ARE using Time Machine, you can start up from your installation disc. At the first Installer screen, go up to the menu bar, and from the Utilities menu, first select to run Disk Utility. Completely erase the internal drive using the Erase tab; make sure you have the internal DRIVE (not the volume) selected in the sidebar, and make sure you are NOT erasing your Time Machine drive by mistake. After erasing, quit Disk Utility, and select the command to restore from backup from the same Utilities menu. Using that Time Machine volume restore utility, you can restore to a time and date immediately before the problem started, when things were working.
    If you are not using Time Machine, you can erase and reinstall the OS (after you have backed up your personal data). After restarting from the new installation and installing all the updates using Software Update, you can restore your personal data from the backup you just made.

  • Capture Image Of A Very Large JPanel

    Below is some code used to save an image of a JPanel to a file...
        int w = panel.getSize().width;
        int h = panel.getSize().height;
        BufferedImage image = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics graphics = image.getGraphics();
        // Make the component believe its visible and do its layout.
        panel.addNotify();
        panel.setVisible(true);
        panel.validate();
        // Draw the graphics.
        panel.print(graphics);
        // Write the image to a file.
        ImageFile imageFile = new ImageFile("test.png");
        imageFile.save(image);
        // Dispose of the graphics.
        graphics.dispose();
    This works fine, but my problem is that I am trying to save what may be a very large JPanel, perhaps as large as 10000x10000 pixels. It doesn't take long for the Java heap to be used up and an exception to be thrown.
    I know I can increase the heap size of the JVM but since I can't ever be sure how large the panel may be that's a far from ideal solution.
    So the question is how do I save an image of a very large JPanel to a file?

    1) Does the OoM happen while instantiating the BufferedImage (which probably tries to allocate a big contiguous native array of pixels)?
    Or the Graphics object (same reason, though the Graphics is probably just an empty shell over the big pixel array)?
    2) In which format do you need to save the image? Do you only need to be able to read it again in your own program?
    If yes to both questions, then a pulled-by-the-hair solution could be to instantiate your own Graphics subclass (no BufferedImage) whose operations would save their arguments directly to the image file, instead of into a big in-memory model of the panel image.
    If the output format is a standard one though (GIF, JPG, ...), then maybe your custom Graphics's operations could contain the logic to encode/compress as much as possible of the arguments into an in-memory byte array of the target format?
    I'm not very confident though; I don't know the GIF or JPEG encoding, but I suspect (especially for JPEG) that you need to know the "whole" image to encode it properly.
    But if the target format supports encoders that work on the fly out of streams of bytes (e.g. BMP), then you can use whatever compress/uncompress technique you see fit (e.g. RLE): you know the nature of the panels, so you may be aware of optimizations you can perform in pixel storage prior to encoding (e.g. big empty areas, predictable chessboard pattern, black-and-white palette, ...).
    Edited by: jduprez on Sep 19, 2009 7:33 PM
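    Another route, not raised above, is to avoid the single huge BufferedImage entirely: render the panel in fixed-size tiles by translating a small Graphics, and write each tile to its own PNG, so memory scales with the tile size rather than the panel size. A sketch under the assumption that the panel is already realized and laid out, as in the original snippet; the tile size and file naming are arbitrary:
        import java.awt.Graphics2D;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import java.io.IOException;
        import javax.imageio.ImageIO;
        import javax.swing.JPanel;

        public class TiledPanelCapture {
            /** Saves the panel as a grid of tile PNGs (baseName_row_col.png) instead of one huge image. */
            public static void saveTiled(JPanel panel, int tileSize, String baseName) throws IOException {
                int w = panel.getWidth();
                int h = panel.getHeight();
                BufferedImage tile = new BufferedImage(tileSize, tileSize, BufferedImage.TYPE_INT_RGB);
                for (int ty = 0; ty < h; ty += tileSize) {
                    for (int tx = 0; tx < w; tx += tileSize) {
                        Graphics2D g = tile.createGraphics();
                        g.setClip(0, 0, tileSize, tileSize);
                        g.setColor(panel.getBackground());
                        g.fillRect(0, 0, tileSize, tileSize); // clear leftovers from the previous tile
                        g.translate(-tx, -ty);                // shift so this tile's region lands at (0,0)
                        panel.print(g);                       // paint only the clipped region into the small buffer
                        g.dispose();
                        File out = new File(baseName + "_" + (ty / tileSize) + "_" + (tx / tileSize) + ".png");
                        ImageIO.write(tile, "png", out);
                    }
                }
            }
        }
    The trade-off is that you end up with a mosaic of files rather than one image, so it only helps if the consumer can work with tiles.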

  • Best data structure for dealing with very large CSV files

    Hi, I'm writing an object that stores data from a very large CSV file. The idea is that you initialize the object with the CSV file, and it then has lots of methods to make manipulating and working with the CSV file simpler: operations like copy column, eliminate rows, perform some equation on all values in a certain column, etc. Also a method for printing back to a file.
    However, the CSV files will probably be in the 10 MB range, maybe larger, so simply loading them into an array isn't possible, as it produces an OutOfMemory error.
    Does anyone have a data structure they could recommend that can store the large amounts of data required and is easily writable? I've currently been using a RandomAccessFile, but it is awkward to write to, as well as needing an external file which would have to be cleaned up after the object is removed (something very hard to guarantee occurs).
    Any suggestions would be greatly appreciated.
    Message was edited by:
    ninjarob

    How much internal storage ("RAM") is in the computer where your program should run? I think I have 640 Mb in mine, and I can't believe loading 10 Mb of data would be prohibitive, not even if the size doubles when the data comes into Java variables.
    If the data size turns out to be prohibitive of loading into memory, how about a relational database?
    Another thing you may want to consider is more object-oriented (in the sense of domain-oriented) analysis and design. If the data is concerned with real-life things (persons, projects, monsters, whatever), row and column operations may be fine for now, but future requirements could easily make you prefer something else (for example, a requirement to sort projects by budget or monsters by proximity to the hero).
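    If loading everything at once ever does become prohibitive, another common compromise is to stream the CSV a line at a time and keep only what the current operation needs. A minimal sketch, assuming plain comma-separated values with no quoted fields, summing the third column (index 2); the file name is hypothetical:
        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;

        public class CsvColumnSum {
            public static void main(String[] args) throws IOException {
                double sum = 0;
                try (BufferedReader in = new BufferedReader(new FileReader("data.csv"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String[] fields = line.split(",", -1); // keep empty trailing fields
                        if (fields.length > 2) {
                            try {
                                sum += Double.parseDouble(fields[2].trim()); // operate on column index 2 only
                            } catch (NumberFormatException e) {
                                // skip header or non-numeric rows
                            }
                        }
                    }
                }
                // Only one line is held in memory at a time, so files far larger than 10 MB are fine.
                System.out.println("Sum of column 2: " + sum);
            }
        }
    Whole-file operations (sorting, column reordering) still need either memory or a database, which is where the relational-database suggestion above comes in.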

  • LOAD UNIT OF COMPONENT IS VERY LARGE (GENERATION LIMIT)

    We are experiencing this message when compiling an ABAP Web Dynpro application: "LOAD UNIT OF COMPONENT IS VERY LARGE (GENERATION LIMIT)"
    When checking the generation limits in the context menu, I determined that our size of generated load in bytes is too big.
    The documented recommendation is to restructure the program. I am not clear what this means or how it would reduce the generated load in bytes. Any ideas would be appreciated.

    > How should we reorganize the application and at the same time ensure smooth and user-friendly handling? We only want to use one Explorer window.
    Using multiple components doesn't mean that the user will notice any difference. Component usages can be embedded within one another. The ALV, for instance, is a perfect example of a component usage.
    > Even the SAP reference application "LORD_MAINTAIN_COMP" (37 views) is way too big, according to the recommendation. Is there a better example from SAP?
    I wouldn't consider LORD_MAINTAIN_COMP a reference application. It was one of the very first WDAs shipped by SAP, before we learned some of these lessons ourselves. Have a look at the guidelines for Floorplan Manager if you are on 7.01. The FPM provides a very good (and well used by SAP) framework for building large-scale WDA applications.
    > How could a complex transaction be built and at the same time stay in the green limit area (< 500k)?
    As described, the usage of multiple components avoids the generation limit and is recommended for large-scale applications.
    > What at all is the problem in loading 2 Megabytes of data into memory? Could you please describe the technical background in more detail?
    It has nothing to do with loading 2 MB into memory. It has to do with the generation load size in the VM for the generated class that represents your WDA component. The ABAP compiler and VM have limits (like all VMs and compilers) on total load size and the maximum size for operations and leaps. Generated code can be extremely verbose. Under normal conditions, these load limits are almost never reached in human-created classes.
    In 7.02 we backported the 7.20 ABAP compiler, which, in addition to being rewritten to support multi-pass compilation, also increases some of the load limits. However, the general recommendation about componentization still stands. Componentization of your WDA application improves maintainability and reusability over time. My personal rule is that if you are getting between 10 and 12 views in your component, it is time to think about breaking it out into multiple components.
    > Is there a maximum load size which would lead to an error (rejection of generation)?
    Yes, there is. However, the workbench throws warnings well in advance. At some point it won't even let you add more views to a component. If you continue to add content to the existing views, you can reach a point where generation fails.

  • How To Get rid of Exponential format in datagridview when the number is very large

    When the number is very large, like 290754232, I get 2.907542E+08 in the DataGridView cell.
    I am using VB.NET, Framework 2.0.
    How can I get rid of this format?
    Thanks in advance

    Should I change the type of this column to Integer or Long?
    The DataGridView is bound to a BindingSource and a List(Of).
    Mike,
    I'll show you an example that shows the correct way to do this, and another way if you're stuck using strings in exponential format. The latter is the "hack way" I spoke about on Friday. I don't like it, it's dangerous, but I'll show both anyway.
    In this example, I'm using Int64 because I don't know the range of yours. If yours never exceeds Int32 then use that one instead.
    First, I have a DataGridView with three columns. I've populated the data just by creating longs starting with the maximum value in reverse order for 100 rows:
    The way that I created the data is itself not a great way (there's no encapsulation), but for this example "it'll do".
    Notice though that the third column (right-most column) isn't formatted at all. I commented out the part that does that so that I could then explain what I'm doing. If it works, it should look like the first column.
    The first column represents an actual Int64 and when I show the code, you can see how I'm formatting that using the DGV's DefaultCellStyle.Format property. That's how it SHOULD be done.
    The third column though is just a string and because that string contains a letter in it, Long.TryParse will NOT work. This is where the "hack" part comes in - and it's dangerous, but if you have no other option then ...
    You can see that now the third column matches the first column. Now the code:
    Option Strict On
    Option Explicit On
    Option Infer Off

    Public Class Form1

        Private Sub Form1_Load(ByVal sender As System.Object, _
                               ByVal e As System.EventArgs) _
                               Handles MyBase.Load
            With DataGridView1
                .AllowUserToAddRows = False
                .AllowUserToDeleteRows = False
                .AllowUserToOrderColumns = False
                .AllowUserToResizeRows = False
                .AlternatingRowsDefaultCellStyle.BackColor = Color.Aquamarine
                .ReadOnly = True
                .SelectionMode = DataGridViewSelectionMode.FullRowSelect
                .MultiSelect = False
                .RowHeadersVisible = False
                .RowTemplate.Height = 30
                .EnableHeadersVisualStyles = False
                With .ColumnHeadersDefaultCellStyle
                    .Font = New Font("Tahoma", 9, FontStyle.Bold)
                    .BackColor = Color.LightGreen
                    .WrapMode = DataGridViewTriState.True
                    .Alignment = DataGridViewContentAlignment.MiddleCenter
                End With
                .ColumnHeadersHeightSizeMode = DataGridViewColumnHeadersHeightSizeMode.DisableResizing
                .ColumnHeadersHeight = 50
                .DataSource = Nothing
                .Enabled = False
            End With
            CreateData()
        End Sub

        Private Sub CreateData()
            Dim longList As New List(Of Long)
            For l As Long = Long.MaxValue To 0 Step -1
                longList.Add(l)
                If longList.Count = 100 Then
                    Exit For
                End If
            Next

            Dim stringList As New List(Of String)
            For Each l As Long In longList
                stringList.Add(l.ToString("e18"))
            Next

            Dim dt As New DataTable
            Dim column As New DataColumn
            With column
                .DataType = System.Type.GetType("System.Int64")
                .ColumnName = "Actual Long Value (Shown Formated)"
                dt.Columns.Add(column)
            End With

            column = New DataColumn
            With column
                .DataType = System.Type.GetType("System.String")
                .ColumnName = "String Equivalent"
                dt.Columns.Add(column)
            End With

            column = New DataColumn
            With column
                .DataType = System.Type.GetType("System.String")
                .ColumnName = "Formated String Equivalent"
                dt.Columns.Add(column)
            End With

            Dim row As DataRow
            For i As Integer = 0 To longList.Count - 1
                row = dt.NewRow
                row("Actual Long Value (Shown Formated)") = longList(i)
                row("String Equivalent") = stringList(i)
                row("Formated String Equivalent") = stringList(i)
                dt.Rows.Add(row)
            Next

            Dim bs As New BindingSource
            bs.DataSource = dt
            BindingNavigator1.BindingSource = bs
            DataGridView1.DataSource = bs

            With DataGridView1
                With .Columns(0)
                    .DefaultCellStyle.Format = "n0"
                    .Width = 150
                End With
                .Columns(1).Width = 170
                .Columns(2).AutoSizeMode = DataGridViewAutoSizeColumnMode.Fill
                .Enabled = True
            End With
        End Sub

        ' The following is what I commented
        ' out for the first screenshot. ONLY
        ' do this if there is absolutely no
        ' other way though - the following
        ' casting operation is NOT ADVISABLE!
        Private Sub DataGridView1_CellFormatting(ByVal sender As Object, _
                                                 ByVal e As System.Windows.Forms.DataGridViewCellFormattingEventArgs) _
                                                 Handles DataGridView1.CellFormatting
            If e.ColumnIndex = 2 AndAlso e.Value.ToString IsNot Nothing Then
                ' NOTE! The following is dangerous!
                ' I'm going to use coercion to force the
                ' string into a type long. TryParse will
                ' NOT work here. This can easily throw an
                ' exception if the string cannot be cast
                ' to a type long. I'm "depending on" the
                ' the string to cast. At the very least
                ' you might put this in a Try/Catch but
                ' that won't stop it from failing (if
                ' it doesn't work).
                Dim actualValue As Long = CType(e.Value.ToString, Long)
                Dim formattedValue As String = actualValue.ToString("n0")
                e.Value = formattedValue
            End If
        End Sub

    End Class
    Like I said, only use that hack way if there's no other option!
    I hope it helps. :)
    Still lost in code, just at a little higher level.
