"Acceptable" source file size...

Hi all!
I know that there is no standard for this, but what do good and practical programming practices suggest as a good number of lines in a source file, keeping in mind good design and simplicity of code?

Looking at my current project:
The smallest file is 209 bytes, or 13 lines.
But lines probably aren't the best way to judge the size of a class, especially since the actual body of the code contains nothing! (All 13 lines are overhead: package, import, javadoc.) It's an interface to hold a key.
The largest file is 43.1 KB, or 1,374 lines. The class represents a modified version of java.security.Signature. (And if I removed the unnecessary code [currently commented out] I could probably knock 300 lines out.)
The average file is ~4 KB, which equates to ~70-80 lines.
However, whether you can actually compare a University (final year) project to a real world project is open to discussion.
Personally I don't think lines of code is a good metric, especially as different programmers adopt different coding styles, which will affect line count (making it impossible to compare the work of different programmers).
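For what it's worth, numbers like the ones above are easy to gather automatically. Below is a rough Java sketch (the class and method names are my own invention, not any standard tool, and it assumes Java 16+ for `Stream.toList()`) that walks a source tree and prints the min/max/average line counts:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Stream;

public class LineCounts {
    // Count the lines in one source file.
    static long countLines(Path file) throws IOException {
        try (Stream<String> lines = Files.lines(file)) {
            return lines.count();
        }
    }

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        try (Stream<Path> paths = Files.walk(root)) {
            List<Path> sources = paths
                    .filter(Files::isRegularFile)
                    .filter(p -> p.toString().endsWith(".java"))
                    .toList();
            long total = 0, max = 0, min = Long.MAX_VALUE;
            for (Path p : sources) {
                long n = countLines(p);
                total += n;
                max = Math.max(max, n);
                min = Math.min(min, n);
            }
            if (!sources.isEmpty()) {
                System.out.printf("files=%d min=%d max=%d avg=%d%n",
                        sources.size(), min, max, total / sources.size());
            }
        }
    }
}
```

Of course this just measures lines; it says nothing about how much of each file is package/import/javadoc overhead.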

Similar Messages

  • Essbase Source file size limitation

    Hello,
    Is there any source file size restriction on essbase?
I tried to load a txt source file which is 2 GB in size. I was able to load
the file until it reached 1.9 GB; when it reached 2 GB
I got the following error:
    ERROR - 1030100 - Cannot open file: [essdata2/data/vista/vstcy.txt].
    ERROR - 1241101 - Unexpected Essbase error 1030100.
Based on the Essbase DBA guide there is no size limitation.
Is there any way to load a txt source file which is bigger than 2 GB?
    thanks.

I think it is a limitation that Essbase(?)* has on that operating system. I have seen this issue before with 2 GB+ files that load fine on a Windows box, but not on Unix.
Maybe others on here will have a better option; I got around the problem by splitting my load file up into smaller chunks so that it would load.
*Actually, on further reading, this could be a file system limitation.
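Splitting the load file can be scripted in any language. A hedged Java sketch (the class and method names are illustrative, not any Essbase tool) that breaks a text file into fixed-line-count chunks, each written as source.partN:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SplitLoadFile {
    // Split a large load file into chunks of at most maxLines lines each,
    // so every chunk stays safely under a 2 GB file-system limit.
    // Returns the number of chunks written.
    static int split(Path source, int maxLines) throws IOException {
        int part = 0;
        try (BufferedReader in = Files.newBufferedReader(source)) {
            String line = in.readLine();
            while (line != null) {
                Path chunk = Paths.get(source + ".part" + part++);
                try (BufferedWriter out = Files.newBufferedWriter(chunk)) {
                    for (int i = 0; i < maxLines && line != null; i++) {
                        out.write(line);
                        out.newLine();
                        line = in.readLine();
                    }
                }
            }
        }
        return part;
    }
}
```

Because chunks are cut on line boundaries, no data record is ever split across two load files.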

  • iDVD not understanding source file size.

    I'm trying to make a DVD with several different movies. They have all been encoded the same way, but have different lengths. The shorter clips are under 15 minutes, but the longer clip is over an hour long. I re-encoded the longer clip to bring down the file size and it is currently just 600 MB. iDVD sees it as over 4 GB. Am I doing something wrong, or is iDVD messing up?

    iDVD doesn't look at file size but rather duration / QT playback. The most you can get onto a single-layer DVD-R is 2 hours, or 120 minutes, and about twice that for dual-layer media.
    How long is the combined length of all your movies? Hopefully the combined length isn't over 120 minutes (assuming you plan to use single-layer Verbatim or Maxell DVD-Rs). If your movies exceed 2 hours then you may wish to use dual-layer instead of single-layer DVDs (assuming your Mac will write to this newer media).
    Hope this info helps but if not just come on back.

  • Add a file size option for TIF export

    I would like to have an option to set the (uncompressed) file size (or alternatively the size in MPix) for a TIF file and have Lightroom set the pixel dimensions accordingly. The reason is that many libraries specify a file size requirement. If all your images are cropped the same then working out the dimensions is simple, but if they are all different shapes then it becomes a real pain, as you have to work out the dimensions for each one.
    Something somewhat similar has been mentioned for JPEG output: to allow you to specify a file size and have Lightroom adjust quality to achieve the file size you want (leaving pixel dimensions fixed).
    ...rob

    If by file size you mean dimensions (e.g. cropped), then that is built into Lr5 - via smart collection, not library filter.
    If you mean actual source file size (e.g. in bytes), that requires either:
    * John Ellis' AnyFilter, or
    * Jeffrey Friedl's Data Explorer.
    (If there are others, I don't know about them.)
    I have plugins for lots of things, but not everything.
    Cheers,
    Rob
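In the meantime, the arithmetic behind the request is straightforward: an uncompressed 8-bit RGB file is roughly width × height × 3 bytes, so target pixel dimensions follow from the required file size and the image's aspect ratio. A rough Java sketch (my own helper, not anything built into Lightroom; it ignores TIF headers, metadata, and 16-bit output):

```java
public class TiffDimensions {
    // Given a target uncompressed size in megabytes and an aspect ratio
    // (width / height), estimate the pixel dimensions of an 8-bit RGB
    // image at 3 bytes per pixel. Returns {width, height}.
    static long[] dimensionsFor(double targetMb, double aspectRatio) {
        double bytes = targetMb * 1024 * 1024;
        // bytes = width * height * 3 and width = aspectRatio * height,
        // so height = sqrt(bytes / (3 * aspectRatio)).
        double height = Math.sqrt(bytes / (3.0 * aspectRatio));
        double width = aspectRatio * height;
        return new long[] { Math.round(width), Math.round(height) };
    }
}
```

For example, a 48 MB target at a 3:2 crop works out to roughly 5017 x 3344 pixels; resizing each image to the dimensions this returns would meet a library's uncompressed-size requirement without per-image arithmetic.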

  • Advanced Selection for Source File + SourceFileSize = invalid file size?

    Hi !
    I have a File Adapter using the FTP protocol. I've turned on all Adapter-Specific Message Attributes, including SourceFileSize. I've activated the Advanced Selection for Source File and entered several patterns.
    When I see my payload via the monitor, I get a "-" in the SourceFileSize value.
    <sap:Record namespace="http://sap.com/xi/XI/System/File" name="SourceFileSize">-</sap:Record>
    If I disable the "Advanced Selection for Source File", the file size is ok.
    Is it just me?
    Thanks

    Maybe you need to check for some SAP Notes.

  • Event ID: 4, Source: Microsoft-Windows-Kernel-EventTracing, maximum file size for session "ReadyBoot" has been reached.

    Hello,
    I upgraded my machine to Win7 x64 Pro about 3 weeks ago. My HW is an Asus mobo, Intel Q9450 w/8GB RAM. The boot drives are two Raptors configured as RAID01. All the drivers are the latest available from Intel, Asus and 3rd party vendors. My WEI is 5.9, limited by the disk transfer rates, otherwise 7.1 and 7.2 on the other indexes.
    I've been receiving these errors at boot:
    Log Name:      Microsoft-Windows-Kernel-EventTracing/Admin
    Source:        Microsoft-Windows-Kernel-EventTracing
    Date:          11/10/2009 7:51:03 AM
    Event ID:      4
    Task Category: Logging
    Level:         Warning
    Keywords:      Session
    User:          SYSTEM
    Computer:      herbt-PC
    Description:
    The maximum file size for session "ReadyBoot" has been reached. As a result, events might be lost (not logged) to file "C:\Windows\Prefetch\ReadyBoot\ReadyBoot.etl". The maximum files size is currently set to 20971520 bytes.
    Event Xml:
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="Microsoft-Windows-Kernel-EventTracing" Guid="{B675EC37-BDB6-4648-BC92-F3FDC74D3CA2}" />
        <EventID>4</EventID>
        <Version>0</Version>
        <Level>3</Level>
        <Task>1</Task>
        <Opcode>10</Opcode>
        <Keywords>0x8000000000000010</Keywords>
        <TimeCreated SystemTime="2009-11-10T12:51:03.393985600Z" />
        <EventRecordID>28</EventRecordID>
        <Correlation />
        <Execution ProcessID="4" ThreadID="164" />
        <Channel>Microsoft-Windows-Kernel-EventTracing/Admin</Channel>
        <Computer>herbt-PC</Computer>
        <Security UserID="S-1-5-18" />
      </System>
      <EventData>
        <Data Name="SessionName">ReadyBoot</Data>
        <Data Name="FileName">C:\Windows\Prefetch\ReadyBoot\ReadyBoot.etl</Data>
        <Data Name="ErrorCode">3221225864</Data>
        <Data Name="LoggingMode">0</Data>
        <Data Name="MaxFileSize">20971520</Data>
      </EventData>
    </Event>
    The image for PID 4 is listed as System.
    My searches have turned up similar events listed but no solutions.
    Any help would be appreciated.
    Cheers!

    Session "Circular Kernel Context Logger" failed to start with the following error: 0xC0000035
    As suggested above, I assume this is a Microsoft issue? It has been discussed here and on other forums for quite some time, and I have never seen a fix. I wish that when we received errors of this nature Microsoft would tell us what they were. How is this related to Superfetch? What is Superfetch? Why would Superfetch have changed?
    BY THE WAY.... Superfetch is on (started), set to Automatic, and logs on as Local System, so this is not the cause of my issue. Also, what is ReadyBoot? Does the average computer user really know what these programs/services or unique Microsoft words/terms are?
    <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
      <System>
        <Provider Name="Microsoft-Windows-Kernel-EventTracing" Guid="{B675EC37-BDB6-4648-BC92-F3FDC74D3CA2}" />
        <EventID>2</EventID>
        <Version>0</Version>
        <Level>2</Level>
        <Task>2</Task>
        <Opcode>12</Opcode>
        <Keywords>0x8000000000000010</Keywords>
        <TimeCreated SystemTime="2010-04-11T14:35:49.829600000Z" />
        <EventRecordID>25</EventRecordID>
        <Correlation />
        <Execution ProcessID="4" ThreadID="48" />
        <Channel>Microsoft-Windows-Kernel-EventTracing/Admin</Channel>
        <Computer>Daddy-PC</Computer>
        <Security UserID="S-1-5-18" />
      </System>
      <EventData>
        <Data Name="SessionName">Circular Kernel Context Logger</Data>
        <Data Name="FileName"></Data>
        <Data Name="ErrorCode">3221225525</Data>
        <Data Name="LoggingMode">268436608</Data>
      </EventData>
    </Event>

  • How can I change the size of a pdf source file, or, convert it to Word?

    How can I change the size of a pdf source file, or, convert it to Word?

    A lot depends on the form of the PDF. Is it graphics, a scan of a text file, pure text, or other? What version of Acrobat are you working with?
    You can do a Save As to get a Word file, but do not expect great results. The ability to get a decent Word file depends on the form of PDF you are working from. If it was created from Word with tags and all, you might get good results. If not, you might get a lot of messed-up results.
    Explain what you are starting with and your ultimate goal. Also check the audit of the file (should be under PDF Optimizer) to see where the file information is concentrated (text, fonts, graphics, other).

  • TFS Preview source control file size limit

    Is there a file size limit for adding large files to a TFS Preview project's source control? I would like to add some files to source control that are in the 20-30 MB range, and the requests keep aborting.

    Hi Erich,
    Thank you for your post.
    I tested the issue by adding .zip files of several sizes (10 MB, 20 MB, 100 MB, 2 GB) to TFS Azure version control, and I could add them all without any problem.
    In order to narrow down the issue, here are some things I would like to clarify with you:
    1. What's the error message when TFS Azure blocks the check-in process? Could you share a screenshot?
    2. Which client tool are you using? VS10 or VS11? Or other tools?
    3. Try to reproduce this in a different network environment and see what happens.
    4. Capture a Fiddler trace, which can help to check the issue.
    Thanks,
    Lily Wu [MSFT]
    MSDN Community Support | Feedback to us

  • Hyper-V 2012 R2 creating a new vhdx and copying from a source vhdx will physical file size reduce?

    Hi,
    I'm running Hyper-V 2012 R2 as part of Server 2012 Essentials.
    As seems common from other threads I've read, I have created an expanding VHDX with a maximum size that is too large (almost the same size as the physical disk it lives on). Although there is only around 350 GB of data in the VHDX, the actual file size is approaching 900 GB on the host.
    I'm using the "Add New Virtual Disk" wizard to make a new VHDX to replace this massive file. I want the contents of the old VHDX to be copied to the new one so in the add disk wizard I selected the option to copy the contents of my new disk from
    the original.
    So my question is: the source VHDX is 900GB, the data inside the is around 350GB. What size is my new VHDX likely to be? Is it likely to reduce to nearer the actual data size?
    I'm asking because I'm currently running this operation to an external USB drive and it's painfully slow - going to take hours to complete. If the size isn't going to reduce I may as well cancel it and look for a better option.
    Any comments appreciated.
    David

    I am assuming that you are using dynamic virtual disks for your VHDX (they can be dynamic or fixed).
    It is usually some type of transaction, backup agent, or the like that causes this type of behavior: some action that requires a cache on disk that is later deleted.
    There is a shrink action that can be done, but that will not optimize fully.
    The way to fully optimize is to mount the old virtual disk, create a new virtual disk and mount it, and then copy the contents from one to the other. This lays the files out in a nice, orderly way, without anything being written to the end of the virtual disk causing it to expand strangely.
    Over time, if you are still using the backup agent (for example) that got you into this situation, you will be there again.
    Brian Ehlert
    http://ITProctology.blogspot.com
    Learn. Apply. Repeat.

  • Same source, different jar file sizes

    What could be creating this discrepancy in jar file sizes on two machines with exactly the same versions of the SDK etc. installed?
    Machine 1 - obfuscated jar size 148Kb
    Source moved to Machine 2 - obfuscated jar size 222Kb
    Source moved back to Machine 1, mods made, - obfuscated jar size 143Kb
    Source copied to Machine 2, NO mods, - obfuscated jar size 138Kb
    I want to retire Machine 1, and in fact Machine 2 is due to be re-purposed as well when my shiny new one arrives, but I am worried I won't be able to get my file sizes as small as possible, or at least cram in as much as possible, so I would like to know what sort of things impact the packaging process.
    Thanks
    Kathydb


  • Source structure is not accepting the file

    Hi,
    I have imported an external definition, but it's not accepting the file structure sent by the user. It's showing red-coloured fields in the mapping test tab when I paste the payload.
    This is the sample structure of the XML file from the test tab of the mapping of my external definition:
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_Policy xmlns:ns0="http://www.policy.en/XI/Swiss">
       <heading>
          <billcode>1</billcode>
          <version>01</version>
          <source>123</source>
          <destination>424</destination>
          <date>43</date>
          <hour>4234</hour>
          <code>324</code>
          </heading>
    This is the file sent by the user
    <?xml version='1.0'encoding='ISO-8859-1'?>
    <MT_Policy>
    <heading>
    <billcode>420054842</billcode>
    <version>001</version>
    <source>EN12</source>
    <destination>0230</destination>
    <date>2011-01-04</date>
    <hour>14:09:03</hour>
    <code>01</code>
    </heading>
    Please let me know; to me it looks like the structures are the same. Why is it not being accepted?

    Hi,
    It seems you have created a data type and message type for the structure; that's why it has ns0 tags and an XML namespace. The encoding is also different.
    You have to handle this. The best way is to convert your test data into an XSD using any XML tool and import it; it will work.
    Or remove the XML namespace in the message type and try it; I think that will work too.
    Regards,
    Raj

  • Acceptable File Size and Resolution for Still Photos in Keyframe Motion

    Hello Hello
    I know that FCE HD (like other video editors) will automatically fit a still photo's size to its default capability of 720 x 480, correct?
    Well, in regard to making the best possible zooms and pans from still photos, is it necessary to re-size all your photos (in an external editor like Photoshop) to a specific size?
    According to an older book for FCE, photos re-sized to 720x534 with a resolution of 72 are actually best (the proper proportion for NTSC).
    And I noticed that when I inserted larger JPG stills (about a meg) on the FCE timeline and applied panning or zooms, some of the rendered images with motion didn't pan or zoom smoothly.
    Is that because those file sizes were TOO large?
    Well I have soooo many stills that need to be imported into this particular project that it would be a lot easier for me to just generally reduce these stills (by percentage).
    And is the magic resolution amount supposed to be 72?
    Because if that's the case I can re-size all my photos to 72, and try to keep the file sizes down to say 300 KB's or so.
    Sound like a plan?

    Thank you guys
    "basically, you should try to make your images large enough so they never have to be scaled beyond 100% in the motion tab in fcp."
    Actually, as I mentioned, some of these file sizes are very large, about 800-900 KB, and as a result I've noticed that I actually had to DECREASE the viewing area in both the Browser window and the Canvas window just to be able to see the whole image.
    OK, so the 72 DPI is not as important as the dimensions of the photo. But as I said, I have so many of these that it would take me forever to manually re-size them all, not to mention the fact that re-sizing some of them (odd shapes) would throw the images out of balance.
    So again I ask: if there is NO motion applied to the photo, file sizes of about 200 to 350 KB appear just beautifully...
    BUT
    If I need to PAN or ZOOM, is it OK to load a 1 MB JPG onto the timeline and start working with keyframes?
    In fact, in some cases I'm actually using these larger JPGs (1 meg or more) so that they TOTALLY fill the canvas window (cropping them by enlargement)
    *With some clipping of the original image, of course...
    Why do I do this?
    So that you don't see the usual border with horizontal or vertical images - know what I mean?
    Thanx

  • Get the file size of a file on the server.

    Okay, for my download manager I need to get the file size. I tried using the available() method of InputStream, and getContentLength() of URLConnection, which worked but only for web pages. So does anyone know how to do this? Sun does it in SDM, but I don't think the source is included. I'll download it again and check.
    thanks.

    Content-Length is not guaranteed to be anything, I don't think. Correct me if I am wrong:
    http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#s
    Section 4.4 states:
    " When a Content-Length is given in a message where a
    message-body is allowed, its field value MUST exactly
    match the number of OCTETs in the message-body.
    HTTP/1.1 user agents MUST notify the user when an
    invalid length is received and detected."
    This sort of guarantees that if a file is being downloaded, the Content-Length gives the size of the file. I am able to confirm this by sniffing the traffic of my download manager.

    This is only guaranteed if the Content-Length field is present. :-)
    However, looking two paragraphs up:
    'All HTTP/1.1 applications that receive entities MUST accept the "chunked" transfer-coding (section 3.6), thus allowing this mechanism to be used for messages when the message length cannot be determined in advance.'
    There is mechanism built in for cases where the size cannot be determined.
    Regards,
    Bhaveet
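Putting the two points together: ask for Content-Length first, and fall back to counting bytes when the server used chunked transfer-coding and advertised no length. A sketch using the standard HttpURLConnection API (getContentLengthLong() returns -1 when no length was advertised; the class and helper names are my own):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RemoteFileSize {
    // Ask the server for the size of a remote file with a HEAD request.
    // Returns -1 when the server omits Content-Length (e.g. chunked
    // transfer-coding), in which case the size is unknown in advance.
    static long remoteSize(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("HEAD");
        try {
            return conn.getContentLengthLong();
        } finally {
            conn.disconnect();
        }
    }

    // Fallback: count bytes while streaming when no length was advertised.
    static long sizeByStreaming(InputStream in) throws IOException {
        long total = 0;
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            total += n;
        }
        return total;
    }
}
```

A download manager would call remoteSize() up front to size its progress bar, and fall back to the streaming count (no progress percentage, just bytes so far) when it returns -1.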

  • Large folio file size

    We are half way through a book that comprises 100 single-page articles. However, it is already nearly 500 MB, and this isn't sustainable.
    Does the following affect the file size:
    Is the folio file size affected by the number of individual articles? Would it be smaller if we had stacks of, say, 10 articles each with 10 pages, rather than 100 single pages?
    Every page has a two-picture (JPG) object state; the first image is an extreme enlargement that is visible for only about a second before the full-frame image appears. Each page has a caption using a Pan overlay that can be dragged into the page using a small tab. Does an object state increase the file size over and above the images contained within it?
    We have reduced the JPGs to the minimum acceptable quality and there is no video in the folio.
    Any ideas would be much appreciated.

    800 MB worth of video sounds crazy. Of course, a high number of videos can bring you to that.
    I have seen bigger DPS apps. I think the Apple limit lies around 4 GB (remember, that is more than 25% of a whole 16 GB iPad).
    The MP4 video codec does a really good job while keeping the quality high. And the human eye is more forgiving of quality when it comes to moving images compared to still imagery.
    I wrote a collection of tips and ideas on how to reduce your file size:
    http://www.google.de/url?sa=t&source=web&cd=1&ved=0CB4QFjAA&url=http%3A%2F%2Fdigitalpublishing.tumblr.com%2Fpost%2F11650748389%2Freducing-folio-filesize&ei=uVbeTv_yD--M4gTY_OWbBw&usg=AFQjCNHroLkcl-neKlpeidULpQdosl08vw
    —Johannes

  • Significant reduction in file size from Camera Raw to DNG

    Hi,
    I am currently testing the conversion of Leaf camera raw files into DNGs for a photographer's archive. I am hoping to convert all of the .mos files to DNGs because Leaf Capture and the Leaf Raw Converter are not being updated, and because the photographer wants to have an Adobe-centered workflow. In my testing I discovered that converting .mos files to DNGs through ACR 8.4 and Lightroom 5.4 resulted in a reduction of file size by nearly 50%: a 44.5 MB .mos file became a 23.6 MB DNG. From what I've read, only about 15-20% of the camera raw file should be lost, and all of the data lost should be proprietary.
    Herein lies my question: is there any way that I can track or determine exactly what sort of compression is being done to the .mos file and what information is or is not travelling in the conversion to DNG?
    These are the settings I have used for converting raw files to DNGs:
    ACR:
    JPEG Preview: Medium Size
    Embed fast load data
    Don't use lossy compression
    Preserve pixel counts
    Don't embed original
    LIGHTROOM 5.4:
    Only Convert Raw files
    Delete originals after successful conversion
    File Extension DNG
    Compatibility Camera Raw 7.1 and later
    Jpeg Preview Medium Size
    Embed Fast Load Data
    Thanks!

    50%? - I thought we were talking about 15-20%?
    In my first post I questioned why I was seeing a reduction in file size of 50% when, according to forums and articles I've read, I should only be seeing a 15-20% reduction. I then wondered what data I might be losing, which you addressed.
    Same as what? - what were the results.
    I was referring to testing I performed on camera raw files produced during different years (all .mos). I converted all files with the same ACR and LR settings and found that the DNGs always reflected a 50% reduction in file size. This test suggests that any conversion issue is not necessarily related to how the camera raw files might have been built differently across years.
    Adobe's raw data compression is touted by DNG zealots, but I haven't scrutinized it enough to corroborate or refute.., but my experience is that reduction is relatively marginal. All of this assumes original is also compressed - if uncompressed in original source, savings would be large.
    The files I am dealing with are definitely uncompressed, which could account for the large reduction in file size. I didn't realize until I posted to this thread that converting to a DNG results in a compression of the original image data. I understand that this compression is supposed to be lossless, like a lossless compression to a TIFF, and thus result in no decrease in image quality or harm to the original image. I am baffled by how any compression of a file (especially by 50%) could not result in a loss of important data, but I will accept that it is possible to have a truly lossless compression and that the size reduction I am seeing could be a result of all of the different processes a file undergoes that you have outlined.
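The "lossless" part is easy to demonstrate with any general-purpose compressor: decompressing returns a byte-for-byte identical result, so no data is lost even when the size drops sharply, especially on uncompressed input with lots of repetition. A small sketch using java.util.zip's Deflater/Inflater (this illustrates lossless compression in general, not the specific scheme DNG uses):

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class LosslessDemo {
    // Compress a byte array with DEFLATE.
    static byte[] compress(byte[] input) {
        Deflater deflater = new Deflater(Deflater.BEST_COMPRESSION);
        deflater.setInput(input);
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        while (!deflater.finished()) {
            out.write(buf, 0, deflater.deflate(buf));
        }
        deflater.end();
        return out.toByteArray();
    }

    // Decompress and recover the original bytes exactly.
    static byte[] decompress(byte[] input) throws DataFormatException {
        Inflater inflater = new Inflater();
        inflater.setInput(input);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        while (!inflater.finished()) {
            out.write(buf, 0, inflater.inflate(buf));
        }
        inflater.end();
        return out.toByteArray();
    }
}
```

Repetitive data (like smooth areas of an image) can shrink well past 50% and still round-trip exactly, which is why a large reduction by itself says nothing about data loss.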
    I looked into the effects that backwards compatibility has on the conversion process which might interest you http://dpbestflow.org/DNG#backwards-compatibility
    I also posted to luminous landscape's forums http://www.luminous-landscape.com/forum/index.php?topic=89101.new;topicseen#new
    Although it wouldn't surprise me if the DNG conversion process tossed the xmp-like metadata, and kept the original stuff, but it would surprise me if it tossed the original stuff - but as I said before, I haven't scrutinized for completeness so I don't know.
    I've done testing in which I converted .mos camera raw files with their sidecar xmps and without their sidecar xmps. My tests revealed that the DNG definitely carries over xmp metadata although it is not clear to me exactly how it is carried and if anything is lost.
