MP4/H264 file sizes very different

I seem to have the problem that when I convert files from AVI to MP4 or H.264, they remain rather large.
I converted a movie of 124 minutes to H.264 and the file is 494 MB, while I also have a 154-minute movie which is only 352 MB. In iTunes the info for the 124-minute movie says low-complexity H.264, 320x240, at a bit rate of 64 kbps (total bit rate 539 kbps). It is 494 MB, or 3.9 MB/min.
The 154-minute movie's info says low-complexity MPEG-4, 320x240, at a bit rate of 63 kbps (total bit rate 313 kbps). It is 352 MB, or 2.3 MB/min.
I am using Total Video Converter and leave all the settings at default; I just select H264 and change the dimensions to 320x240. Does anyone have any idea how I can get the H.264 or MPEG-4 files to be smaller?

H.264 is a higher-quality codec than plain MPEG-4, and your H.264 file was also encoded at a higher total bit rate (539 kbps vs. 313 kbps), hence the larger file size.
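As a sanity check, file size is just total bit rate multiplied by running time. Here is a minimal sketch of that arithmetic in Java (the class and method names are made up; the bit rates and durations are the ones quoted above):

    public class SizeFromBitrate {
        // Estimated size in MB = kbps / 8 (KB per second) * seconds / 1024.
        static double sizeMB(int totalKbps, int minutes) {
            return totalKbps / 8.0 * 60 * minutes / 1024.0;
        }

        public static void main(String[] args) {
            System.out.printf("124 min @ 539 kbps -> ~%.0f MB%n", sizeMB(539, 124)); // ~490 MB
            System.out.printf("154 min @ 313 kbps -> ~%.0f MB%n", sizeMB(313, 154)); // ~353 MB
        }
    }

In other words, both sizes follow directly from the total bit rates; to get smaller files you need to lower the total bit rate in the converter, not just change the codec or the frame dimensions.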

Similar Messages

  • After duplicate operation, file sizes (checkpoint file sizes) are different

    Hi,
    I have some questions.
    We are testing 4-way replication. After a duplicate operation, the file sizes (checkpoint file sizes) are different at the OS level (du -sh).
    Is this normal?
    TimesTen Version : TimesTen Release 7.0.5.0.0 (64 bit Solaris)
    OS Version : SunOS 5.10 Generic_141414-02 sun4u sparc SUNW,SPARC-Enterprise
    [TEST17A] side
    [TEST17A] /mmdb/DataStore # du -sh ./*
    6.3G ./SAMPLE
    410M ./SAMPLE_LOG
    [TEST17A] /mmdb/DataStore/SAMPLE # ls -lrt
    total 13259490
    -rw-rw-rw- 1 timesten other 501 Aug 14 2008 SAMPLE.inval
    -rw-rw-rw- 1 timesten other 4091428864 Jan 29 02:13 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4113014784 Jan 29 02:23 SAMPLE.ds0
    [TEST17A] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize ;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36991
    PERM_IN_USE_HIGH_WATER: 36991
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5864
    TEMP_IN_USE_HIGH_WATER: 6757
    [TEST17B] side
    [TEST17B] /mmdb/DataStore # du -sh ./*
    911M ./SAMPLE
    453M ./SAMPLE_LOG
    [TEST17B] /mmdb/DataStore/SAMPLE # ls -lrt
    total 1865410
    -rw-rw-rw- 1 timesten other 334 Dec 11 2008 SAMPLE.inval
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:25 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:25 SAMPLE.ds0
    [TEST17B] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 432128
    PERM_IN_USE_HIGH_WATER: 432128
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5422
    TEMP_IN_USE_HIGH_WATER: 6630
    [TEST18A] side
    [TEST18A] /mmdb/DataStore # du -sh ./*
    107M ./SAMPLE
    410M ./SAMPLE_LOG
    [TEST18A] /mmdb/DataStore/SAMPLE # ls -lrt
    total 218976
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:22 SAMPLE.ds0
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:32 SAMPLE.ds1
    [TEST18A] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36825
    PERM_IN_USE_HIGH_WATER: 37230
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 6117
    TEMP_IN_USE_HIGH_WATER: 7452
    [TEST18B] side
    [TEST18B] /mmdb/DataStore # du -sh ./*
    107M ./SAMPLE
    411M ./SAMPLE_LOG
    [TEST18B] /mmdb/DataStore/SAMPLE # ls -lrt
    total 218976
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:18 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:28 SAMPLE.ds0
    [TEST18B] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36785
    PERM_IN_USE_HIGH_WATER: 37140
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5927
    TEMP_IN_USE_HIGH_WATER: 7199
    Thank you very much.
    GooGyum

    You don't really give much detail on what operations were performed and in what sequence (e.g. duplicate from where to where), nor whether there was any workload running when you did the duplicate. In general, checkpoint file sizes amongst replicas will not be the same, nor remain the same, because:
    1. Replicas are logical replicas, not physical replicas. Replication transfers and applies logical operations, and even if you try to do exactly the same thing at both sides in exactly the same order, there are internal operations etc. that are not necessarily synchronised, which will cause the sizes of the files to vary somewhat.
    2. The size of the file as reported by 'ls -l' represents the maximum offset that has so far been written to in the file, but the current 'usage' of the file may be less than this at present.
    3. Checkpoint files are 'sparse' files (unless created with PreAllocate=1), so the space used as reported by 'du' will in general not correspond to the size of the file as reported by 'ls -l'.
    Unless you are seeing some kind of problem, I would not be concerned by an apparent difference in size.
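    Point 3 is easy to demonstrate with a scratch program. A minimal sketch (assuming a filesystem that supports sparse files; the file name is made up): seeking far past end-of-file and writing a single byte produces a file whose 'ls -l' size is about 1 GB while 'du' reports only a few kilobytes.

    import java.io.File;
    import java.io.RandomAccessFile;

    public class SparseDemo {
        public static void main(String[] args) throws Exception {
            try (RandomAccessFile raf = new RandomAccessFile("sparse.dat", "rw")) {
                raf.seek(1L << 30);   // move 1 GB into the (empty) file without writing
                raf.write(0);         // write one byte; the hole before it stays unallocated
            }
            // 'ls -l' reports the maximum offset written (~1 GB);
            // 'du' reports only the blocks actually allocated (a few KB).
            System.out.println("apparent size: " + new File("sparse.dat").length() + " bytes");
        }
    }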
    Chris

  • Duplicates in my iPhoto library with different file sizes: is there a way to delete the smaller ones?

    I have a few hundred duplicates in my iPhoto library, but the file sizes are different. So one is 1.3 MB and one is 567 KB. I want to delete the smaller ones, but short of comparing each duplicate, is there a way to do this? I've been looking at Duplicate Annihilator but I don't think it can do it.
    Thanks!

    I just ran a test with iPhoto Library Manager, Duplicate Annihilator, iPhoto Duplicate Cleaner, Duplifinder and Photodedupo. I imported a folder of 5 photos into a test library 3 times, allowing iPhoto to import duplicates. I then ran the 5 photos thru a resizer to reduce their JPEG compression, keeping all other aspects of the files the same.
    None of the duplicate removal apps found the set that was reduced in the file resizer. That's probably due to the fact that the file creation date was being used as a criterion, and the resized photo would have a different file creation date even though the Image Capture date was the same.
    They all found the 3 regular duplicates, and some of them would mark two and leave the 3rd unmarked. iPhoto Duplicate Cleaner can sort the found duplicates by file size, but if the file was edited to get the reduced file size it might not be found, as it would have a different file creation/modification date.
    iPhoto Library Manager was able to find all duplicates and mark them as such if the file names were the same and the filename option was selected. Otherwise it also missed the modified, resized version. It allowed one to select the one photo to save before going to work on the library.
    So if a photo has been reduced in image quality or pixel size it will not be considered a duplicate.
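    (As an aside: none of these tools compare picture content, which is why date-based criteria miss the re-encoded copies. A content-hash approach, sketched below purely as an illustration and not something any of the apps above do, finds byte-identical duplicates regardless of file dates, though it would still miss the resized versions because their bytes differ.)

    import java.math.BigInteger;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.security.MessageDigest;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Stream;

    public class ExactDupes {
        public static void main(String[] args) throws Exception {
            Map<String, List<Path>> byHash = new HashMap<>();
            try (Stream<Path> files = Files.walk(Paths.get(args[0]))) {   // folder to scan
                for (Path p : (Iterable<Path>) files.filter(Files::isRegularFile)::iterator) {
                    // Hash the raw bytes, so only byte-identical files group together.
                    String hash = new BigInteger(1, MessageDigest.getInstance("MD5")
                            .digest(Files.readAllBytes(p))).toString(16);
                    byHash.computeIfAbsent(hash, k -> new ArrayList<>()).add(p);
                }
            }
            // Any group with more than one path is a set of exact duplicates.
            byHash.values().stream().filter(g -> g.size() > 1)
                  .forEach(g -> System.out.println("identical copies: " + g));
        }
    }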
    OT

  • File sizes are different?

    Hi Experts,
    I have implemented an ESR bypass scenario.
    The problem is that the file sizes are different in the source folder and the target folder.
    How can we resolve this issue?
    Please guide me.
    Thanks and Regards,
    Ravi Teja.

    Refer to question no. 2 in the wiki below and see if that configuration helps.
    Sender File Adapter Frequently Asked Questions - Process Integration - SCN Wiki

  • Is it possible to have same file size for different dpi?

    I changed one .TIFF file (300 dpi, 1024x1332) to .jpg files at four different dpi settings. But when I checked the four resulting jpg files, I found that they are all the same file size and quality. (I also checked the properties of the files in Windows.)
    I think more dpi means more data and a larger file size. Am I wrong?
    I use Photoshop CS 5.1(64bit, WINDOWS) - which is part of my Adobe Master Collection CS5.5.
    TIFF(300dpi, 1024X1332) ->
    1. JPG(72dpi, 1024X1332) : 306KB
    2. JPG(108dpi, 1024X1332) : 306KB
    3. JPG(144dpi, 1024X1332) : 306KB
    4. JPG(600dpi, 1024X1332) : 306KB
    I tested a few more times with different files, and the same result came out (same file size for different dpi).
    Thanks in advance.

    Yes, absolutely. Great observation. PPI does not control the number of pixels, just how big they are.
    Now, if you change the PPI in the Image Size dialog with Resample checked, then that is a different story. In that case you will be changing the pixel dimensions of the image (i.e., changing the total number of pixels making up the image) while keeping its print size.
    In your test files, you will notice all the print sizes are different, because all you were telling Photoshop to do was change the size of the pixels (if or when the image is ever printed), which is really just a bit of metadata in the file.
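    A quick way to see that relationship, using the pixel dimensions from your test files (print size in inches = pixels divided by ppi; a throwaway sketch):

    public class PrintSize {
        public static void main(String[] args) {
            int widthPx = 1024, heightPx = 1332;          // pixel dimensions never change
            for (int ppi : new int[] {72, 108, 144, 600}) {
                // Only the print dimensions vary; the pixel count (and hence the
                // JPEG file size) is untouched by the ppi metadata.
                System.out.printf("%d ppi -> %.1f x %.1f inches%n",
                        ppi, (double) widthPx / ppi, (double) heightPx / ppi);
            }
        }
    }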

  • Spooling of a query generates different file sizes for different databases

    Please help me with a problem with spooling. I spooled query output to a file from two different databases. In both databases the table structure is the same and the output produced only one row, but the file size differs between the databases. How can this happen? Is there a database parameter that needs to be checked? Both databases are on the same version, 10.2.0.1.0.
    Before running the spool I did:
    sql> set head off feedback off echo off verify off linesize 10000 pages 0 trims on colsep ' '
    on both the sessions.
    In one database the file size is 1463 bytes and in the other it is 4205 bytes.
    Please help me find the reason for these discrepancies.

    hi Mario,
    I think you are not getting my point. Both files contain the same output but their sizes are different; this is due to the number of blank spaces between columns. I wanted to clarify why there is a difference between the two file sizes when the query output is the same.

  • MP4 & compression & file size, etc.

    I know very little about video exporting and compression other than the most basic steps. My project is creating a long list of 4-10 minute tutorial videos for internet streaming. I have produced one video so far, and at 5 minutes, 1280x800, H264 mp4, it's 65mb. That's way too big.
    After many hours of research and numerous trial exports, I'm not gaining any ground. I just don't have time to spend days/weeks learning this in depth.
    I used a screen capture video program and recorded at the full screen resolution, 1280x800. The initial streaming presentation would be a smaller resolution, but should the viewer choose to go full screen, I would like it to be crisp and clear, so my goal is to maintain 1280x800, or close to it.
    Lowering the quality seems to defeat the purpose as the quality is substantially degraded. My question is, what are my reasonable expectations in terms of file size for a 5 minute mp4 video at 1280x800? Is 65mb about right? I have no idea but if I have 30 videos, and 5 minutes is one of the shorter ones, that could be many many gigabytes. Is there something I am missing how to get smaller file sizes?
    H264 compression was recommended, and I tried out several resolutions, having to calculate different multiples of 1280x800; it all seems very manual and tedious. I just don't know how to proceed since the first video was just so big... Any guidance is greatly appreciated! Thank you.

    I know very little about video exporting and compression other than the most basic steps. My project is creating a long list of 4-10 minute tutorial videos for internet streaming. I have produced one video so far, and at 5 minutes, 1280x800, H264 mp4, it's 65mb. That's way too big.
    Actually, 65 MB is quite a reasonable file size for a 5-minute, 1280x800, H.264/AAC MP4 file. (File Size = Total Duration X Total Data Rate, so your data rate is only on the order of 1.7 Mbps, which was originally the limit for 640x480 5th-generation iPod files.) In order to make the files smaller you will have to either settle for reduced video quality and/or a smaller display size. (You could, for instance, create a 1024x800 anamorphic encode that displays as a 1280x800 file, but this would only reduce the file size by a small amount, whereas a 640x400 non-anamorphic file could cut the file size significantly while retaining similar quality at the smaller display size.) In short, you need to re-evaluate your streaming/fast start requirements. (I.e., I typically use 2 to 4 times your data rate for what I consider "good quality" 720p24 file encodes for viewing on HD-capable devices.)
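    To see where the 1.7 Mbps figure comes from, just rearrange that formula (a back-of-the-envelope sketch using the 65 MB / 5 minute numbers from the question):

    public class DataRate {
        public static void main(String[] args) {
            double fileMB = 65.0;                 // file size from the question
            double seconds = 5 * 60;              // 5-minute video
            // File Size = Total Duration x Total Data Rate, rearranged:
            double mbps = fileMB * 8 / seconds;   // megabits per second
            System.out.printf("~%.2f Mbps%n", mbps);   // prints ~1.73 Mbps
        }
    }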
    After many hours of research and numerous trial exports, I'm not gaining any ground. I just don't have time to spend days/weeks learning this in depth.
    It is unlikely you will be able to further reduce your file size without a loss in video quality and/or a reduction in the file's display dimensions.
    I used a screen capture video program and recorded at the full screen resolution, 1280x800. The initial streaming presentation would be a smaller resolution, but should the viewer choose to go full screen, I would like it to be crisp and clear, so my goal is to maintain 1280x800, or close to it.
    The only way to reduce file size is to either "throw away" data or use a more efficient, higher compression codec. H.264 is about the most scalable, highly efficient, highest compression capable codec you can use which is why it is used for everything from FaceTime to BD/AVCHD encodings.
    Lowering the quality seems to defeat the purpose as the quality is substantially degraded. My question is, what are my reasonable expectations in terms of file size for a 5 minute mp4 video at 1280x800? Is 65mb about right? I have no idea but if I have 30 videos, and 5 minutes is one of the shorter ones, that could be many many gigabytes. Is there something I am missing how to get smaller file sizes?
    "Reasonable epectations" are realative—and may be quite different for each person. Basically, the expectation is reasonable if the file delivers the quality you want at the display dimensions you want in a file size with which you can live. If not, then you have to re-evaluate your "expectations." Further, encoding is driven by the content itself with every file being different and should be treated as such depending on the graphic complexity of the content, the number/type of vector motions involved, ratio of light:dark scenes, display dimensions, etc. There is no "one shoe fits all" here and what is "reasonable" for one file may not be "reasonable" for another. What you are actually missing here is an overall goal strategy.
    H264 compression was recommended, and I tried out several resolutions, having to calculate different multiples of 1280x800; it all seems very manual and tedious. I just don't know how to proceed since the first video was just so big... Any guidance is greatly appreciated!
    You say you are creating tutorial files for internet streaming. If this is your goal, how do you plan to deliver these files in terms of internet connection speeds? The target speed for internet delivery will determine the data rate limits within which you must work. Once you determine the playback data rate limits for target users, you will know the data rate limits at which to encode your file for playback. In turn, this data rate will determine the combination of display dimensions and level of video quality you will have to accept. At this point the size of individual files is of lesser importance, because once the file begins to stream or play in "fast start" mode, the file can continue to play as the data continues to stream or download to the end user's local platform/media player. In fact, at this point you can actually decide if you want to create multiple file versions for users having different target internet connection speeds, or just a single file. Frankly, until you are able to answer such questions, there is very little advice that anyone can give you.

  • Slow Performance or XDP File size very large

    There have been a few reports of people having slow performance in their forms (typically for dynamic forms) or XDP files being very large.
    These are the symptoms of a problem with cut and paste in Designer where a processing instruction (PI), used to control how Designer displays a specific palette, is repeated many, many times. If you look in your XDP source and see a line like <?templateDesigner StyleID aped3?> repeated more than once, then you have the issue.
    Until now the problem has been resolved by applying a style sheet to the XDP and removing the instruction. A patch has been released that will fix the cut-and-paste issue as well as repair your templates when you open them in a Designer with the patch applied.
    Here is a blog entry that describes the patch as well as where to get it.
    http://blogs.adobe.com/livecycle/2009/03/post.html

    My XDP file grew to 145 MB before I decided to see what was actually happening.
    It appears that the LiveCycle Designer ES program sometimes writes a lot of redundant data: the same line millions of times, over and over again.
    I wrote this small Java program, which reduced the size to 111 KB!!! (wow, what a bug that must have been!)
    Here's the source code:
    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileNotFoundException;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;

    public class MakeSmaller {

        // The processing instruction that Designer repeats over and over.
        private static final String DELETE_STRING = "                           <?templateDesigner StyleID aped3?>";

        public static void main(String... args) {
            BufferedReader br = null;
            BufferedWriter bw = null;
            try {
                br = new BufferedReader(new FileReader(args[0]));
                bw = new BufferedWriter(new FileWriter(args[0] + ".small"));
                String line;
                boolean firstOccurrence = true;
                while ((line = br.readLine()) != null) {
                    if (line.equals(DELETE_STRING)) {
                        // Copy only the first occurrence of the repeated line.
                        if (firstOccurrence) {
                            bw.write(line + "\n");
                            firstOccurrence = false;
                        }
                    } else {
                        bw.write(line + "\n");
                    }
                }
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            } finally {
                if (br != null) {
                    try {
                        br.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                if (bw != null) {
                    try {
                        bw.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
    }
    The file that gets generated sits next to the XDP file (same location) but gets the extension .small. Just in case something goes wrong, the original file is NOT modified, as you can see in the source code. And yes, Designer REALLY wrote that line a gazillion times in the .xdp file (shame on the programmers!!)
    You can also see that I still write the first occurrence to the small file, just in case it's needed.

  • Numbers file sizes very large

    I've noticed that file sizes in Numbers are very large. I was using Excel for a long time and all the files I created were between 20-30KB. The files consisted of an Excel workbook, with 1-2 worksheets in it. The same files in Numbers are 200-300KB and if I save them over the network from another computer to my computer, they jump up to over 1MB. Any ideas on this?

    Hello,
    Nothing to do with graphics items.
    An XL file, like an AppleWorks one, is a compiled document in which many components are stored in a very compact shape. One byte was sufficient in AW to represent an operand, as there were about one hundred of them.
    In Numbers, everything is described in pure text with complementary delimiters.
    When a formula uses the operand "COUNTBLANK", it appears with all 10 of its letters.
    In XL as well as in AW6, a date is stored as a floating-point number, while in Numbers it's stored as the string "mercredi 23 janvier 2008 22:33:19".
    Same thing for every attribute of every cell.
    So this results in a huge file stored in XML format. To spare space, when we close a document the XML file is packed in .gz format.
    This is the Index.xml.gz file that we may see by clicking a Numbers document with ctrl depressed and selecting the contextual menu item "Show Package's contents".
    Double-clicking Index.xml.gz will unpack it, giving the expanded Index.xml file.
    I assume there are applications dedicated to XML files. I don't know them, so I just drag and drop the XML file onto a free text editor named Bean, which I find really interesting. Doing that, we may examine the file's contents.
    If someone knows a correct free application able to open and display XML files properly, I'm interested.
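    If you only want to peek inside, a scratch program can unpack the gzip directly; a minimal sketch (the package name MyDoc.numbers is a placeholder):

    import java.io.BufferedReader;
    import java.io.FileInputStream;
    import java.io.InputStreamReader;
    import java.util.zip.GZIPInputStream;

    public class DumpIndexXml {
        public static void main(String[] args) throws Exception {
            // Placeholder path: the packed index inside a Numbers document package.
            String path = args.length > 0 ? args[0] : "MyDoc.numbers/Index.xml.gz";
            try (BufferedReader r = new BufferedReader(new InputStreamReader(
                    new GZIPInputStream(new FileInputStream(path)), "UTF-8"))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(line);   // the document's contents, as plain XML
                }
            }
        }
    }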
    Yvan KOENIG (from FRANCE mercredi 23 janvier 2008 22:49:23)

  • MP4 Merge or Join Files Problem. MP4 Maximum File Size?

    I just came back from vacation with my girlfriend and shot a bunch of home movies with my Canon 5D Mark II. The files created by the camera are MP4s.
    There are a dozen files in total and each one is around 1 GB at 720p.
    What I want to do now is merge them together into one file. I have QuickTime Pro 7 and basically have been copying them all into one file, but once I try to copy the 8th file/chapter into the cumulative 7 previous files that are now in one continuous file, it doesn't really add anything.
    Is it not working because there is a limit on the file size of an MP4? How can I make sure I merge all the files into one? Any help would be greatly appreciated.

    couldn't figure it out.

  • MP4 Maximum File Size?

    I just ripped my Blu-Ray version of Planet Earth to MP4's.
    Each Chapter was burned as a separate MP4 file.
    There are 10 Chapters in total, so I have 10 separate MP4 files.
    Each file is around 1.78GB at 3700kbps at 720p.
    What I want to do now is merge them together into one file, then add custom chapters. Anyways, I have QuickTime Pro 7 and basically have been copying them all into one file, but once I try to copy the 7th file/chapter into the cumulative 6 previous files that are now in one continuous file, it doesn't really add anything.
    Is it not working because there is a limit on the file size of an MP4? Any help would be greatly appreciated.

    That's true. I have a similar question that doesn't involve backing up my own Blu-Rays for viewing on Apple TV, sorry about that, I'll create a new thread for that one.

  • Time Machine Backup File Size: Very Small Compared to Mac HD

    I did my first backup. My Mac HD, where everything is, is about 319.73 GB in capacity and I'm only using 167.78 GB of storage space. When the backup completed, the file size was only 40 GB. I find that so odd. Is that normal? Will everything restore to its original state if I had to reinstall the OS?

    OK, here it is
    Starting standard backup
    Attempting to mount network destination using URL: afp://[email protected]/Data
    Mounted network destination using URL: afp://[email protected]/Data
    Running backup verification
    QUICKCHECK ONLY; FILESYSTEM CLEAN
    CopyHFSMeta: cannot replay journal: Invalid argument
    Backup verification passed!
    QUICKCHECK ONLY; FILESYSTEM CLEAN
    Disk image /Volumes/Data/myname’s Computer (2).sparsebundle mounted at: /Volumes/Time Machine Backups
    Backing up to: /Volumes/Time Machine Backups/Backups.backupdb
    Detected system migration from: /Volumes/Dom's HD
    No pre-backup thinning needed: 21.02 GB requested (including padding), 889.63 GB available
    Waiting for index to be ready (101)
    Copied 26534 files (22.3 GB) from volume Macintosh HD.
    Event store UUIDs don't match for volume: Macintosh HD
    Starting post-backup thinning
    No post-back up thinning needed: no expired backups exist
    Backup completed successfully.
    Ejected Time Machine disk image.
    Ejected Time Machine network volume.
    So, is the problem that I used Time Machine on my previous machine to back up to a different volume? Then perhaps it has just tried to back up things that may have changed in the couple of months since I last backed up on that machine?
    If so, how do I get Time Machine to start afresh with my new Time Capsule?

  • Camera Raw 8.1 file size is different

    When I open my raw images in Photoshop CS5 Camera Raw, the file size is correct; the images were shot at 240 ppi.
    But when I open the same image in Photoshop CS6 Camera Raw 8.1, the image converts to 300 ppi.
    Of course I can go ahead and open the file inside Photoshop and change the image size with Resample Image checked, changing the resolution to 240 ppi, but I don't want to do that.
    I want Camera Raw 8.1 to open all my raw images AS IS, at 240 ppi, the default for a Canon 5D.
    Thank you.

    First of all, you're confusing resolution (ppi) with size (the weight of the file on disk) and possibly overlooking dimensions (so many pixels wide by so many pixels high).
    In any event what ppi you set in camera is utterly irrelevant to Camera Raw.  ACR will open your file in whatever resolution you choose in your ACR workflow options.  If it opens at 300 ppi that's because that's what you chose in ACR.
    In the ACR workflow options you get to choose the color space it will be converted into, the bit depth, the resolution (ppi) and any resizing you wish to do if possible.

  • Export file size is different from the original raw size

    Could someone explain why, when the original size of a raw file in a folder is about 11 MB, Lightroom shows it as about 7.1 MB? And when I export it to a JPEG the size becomes around 3.5 MB.
    I know that an sRAW1 file is 7.1 MB, as stated in the book. But why does it go down to about 3.5 MB when exported to JPEG?
    Is there a way to export it at a higher size, like 7.1 MB?
    I used sRAW1 on a Canon 50D.
    Thanks,
    Ray

    The "actual" size of you images file is measured by this basic formula, 8-bits is one byte. Each pixel has one byte for the Red, one byte for the Green, and one byte for the Blue colour channels. So, we have 3 bytes per pixel. Multiply the total number of Mega Pixals of your camera's sensor by 3 and you have the true size of an 8 bit image file.
    How this image ends up in a final file size depends upon the amount of compression you choose to apply.
    The more complex the image, the harder it is to compress without some degradation of image quality, so the original size of an image will not always be a guide to its size when compressed. For instance, an image with little or no sharpening applied will compress to a much smaller size than the same image with a large amount of sharpening, even though the same compression settings were used on export.
    If you wish to resize your images on export from LR to specific megapixels (and therefore MBs), then LR/Mogrify has this functionality.
    There is a nice simple explanation of this here

  • Why is the file size very big when I add captive runtime?

    I have a game that is 2.56 MB. When I package it up as a captive-runtime APK it becomes 12.2 MB. This is very reasonable, but my problem is that when I install this APK on my phone through debugging, the size of the game jumps from 12.2 MB to 30.7 MB! I am using AIR 3.9 and I don't understand why it is that big. Can anyone help me please?

    IPA and APK files are .zip files in disguise. Your app simply unzips all or a portion of itself when it is installed, depending on the OS. For example, one of mine is 35 MB as an APK and it goes to 50.2 MB after install, every time.
    The captive runtime has little to do with it beyond being one more compressed item. If you added any other random ~10 MB of assets instead of the captive runtime, depending on the content you'd end up with around the same size app anyhow.
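    You can verify both numbers yourself, since an APK can be read with java.util.zip; a minimal sketch (the APK path is a placeholder):

    import java.util.Collections;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class ApkSizes {
        public static void main(String[] args) throws Exception {
            long packed = 0, unpacked = 0;
            try (ZipFile zip = new ZipFile(args[0])) {       // e.g. MyGame.apk (placeholder)
                for (ZipEntry e : Collections.list(zip.entries())) {
                    packed += e.getCompressedSize();         // bytes as shipped in the APK
                    unpacked += e.getSize();                 // bytes once installed/unpacked
                }
            }
            System.out.printf("compressed: %d MB, uncompressed: %d MB%n",
                    packed >> 20, unpacked >> 20);
        }
    }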
