Split and Compress files

Is it possible to compress and split files via the Terminal, without third-party software?
I know I can compress, but I can't find anywhere how to split the result into parts of a certain size.
Furthermore, is there any app to help users with Terminal commands?

There are plenty of compression tools on the Mac, gzip and bzip2 among them.
Lately I like bzip2. This works a little easier with Linux, especially if
the original file is split into a lot of smaller ones.
$ gzip TestFile
$ ll
total 120944
-rw-r--r--@ 1 andya  501  61921000 May 31 22:01 TestFile.gz
$ split -b 10m TestFile.gz
$ openssl dgst -sha256 TestFile.gz xa*
SHA256(TestFile.gz)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
SHA256(xaa)= a3d803049aee16cbbfd679668164707eb9053488fb2ec5720f282a711ee8c451
SHA256(xab)= 0a79e26c77cb47ec09f5cf68cfa45ea8f52f5157cad07c0ac187eaf0ae59ff79
SHA256(xac)= 0f556e8e93dcb41cb3ab20454ab46c016d6596316d75316d810f45e7c2b3682e
SHA256(xad)= abc3db83737346a8af6ac7ba9552c4b71cf45865f7b9faded54f1683b2afd077
SHA256(xae)= 3afbad7b68a1d1c703865422e40cbd68ca512a652f985a0714258b7d936ad0f6
SHA256(xaf)= 11879853fcfbe6df6fb718e1166d4dcae7e0e6ebd92be6c32c104c0a28f0439a
Keep the hashes of the smaller files, in case you get an error on the far end of the transfer.
That way you only need to resend the small file that's corrupt.
To put the TestFile back together:
$ cat xa* > ScratchFile
$ openssl dgst -sha256 TestFile.gz ScratchFile
SHA256(TestFile.gz)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
SHA256(ScratchFile)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
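The per-part check described above can be scripted end to end. A minimal sketch (toy sizes; filenames are illustrative, and this uses sha256sum where the transcript used openssl dgst; on the Mac, shasum -a 256 is the equivalent):

```shell
# Make a scratch file standing in for TestFile.gz from the transcript.
head -c 1000000 /dev/urandom > TestFile.gz

# Split into parts and record a checksum for every part.
split -b 100k TestFile.gz part_
sha256sum part_* > parts.sha256          # shasum -a 256 on the Mac

# On the receiving end: verify each part, re-send only the ones that fail.
sha256sum -c parts.sha256

# Reassemble and confirm the result matches the original.
cat part_* > ScratchFile
cmp TestFile.gz ScratchFile && echo "rejoined file is identical"
```

Ship parts.sha256 alongside the parts; a failed line in the -c output names exactly which part to resend.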

Similar Messages

  • Import issue - split and compressed dump

    Hi,
I received a 15 GB export dump file from a site as below; the files are split and compressed:
    1. xaa 4gb
    2. xab 4gb
    3. xac 4gb
    4. xad 3gb
I have to import these dump files here on a Unix server. I found some documentation on importing a split and compressed dump,
and I followed the steps below:
    1. copy all 4 files into a directory
    2. then i used commands
    rm -f import_pipe
    mknod import_pipe p
    chmod 666 import_pipe
(the import_pipe file is created in the current directory)
nohup cat xaa xab xac xad | uncompress - > import_pipe & (a process number is created, like 23901; do we need to wait for the background process to complete before giving the IMPORT command?)
    then i give
    imp userid=<connection string> file=import_pipe full=yes ignore=yes log=dumplog.txt
    then it shows the imp-0009 error...
Please help me resolve this issue.
Thanks in advance

    Pl post details of OS and database versions of the source and target. You will have to contact the source to determine how these files were created. It is quite possible that they were created using the FILESIZE parameter (http://docs.oracle.com/cd/E11882_01/server.112/e22490/original_export.htm#autoId25), in which case the import process can read from these multiple files without you having to further manipulate them.
    HTH
    Srini
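As an aside, the named-pipe pattern in the question is sound in general, and it answers the parenthetical there: the reader does not have to wait for the background cat, because the FIFO synchronizes writer and reader. A minimal Oracle-free sketch of the same pattern (filenames illustrative):

```shell
# Split a file, then stream the parts back through a FIFO to a single
# reader, the way imp reads from import_pipe in the question.
head -c 300000 /dev/urandom > dump.dat
split -b 100k dump.dat piece_

mkfifo import_pipe                 # mknod import_pipe p also works
cat piece_* > import_pipe &        # writer blocks until something reads
cmp dump.dat import_pipe && echo "pipe delivers the original byte stream"
wait                               # reap the background cat
rm import_pipe
```

The reassembled stream never touches disk, which is the whole point when the parts barely fit as it is.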

  • Quartz, Help, and Compressing Files

I had a .pdf file that needed compressing. Help sent me to ColorSync and Quartz. Filter presented Reduce File Size, but all the options/parameters for that option are grayed out. IOW, Help sent me to a dead end. I did the compression by other means, but the graying out appears to be a bug. Ain't?

    Open "ColorSync Utility.App" and duplicate the "Reduce File Size" option by clicking on the triangle on the right hand side, then edit the copy. Apple doesn't want us fiddling with their presets!
    Back in Preview, do "Save As...." and select your new settings from the Quartz Filter menu.
I haven't a clue why it should be so complicated and convoluted, but it works. PDFs with a lot of bitmap content shrink dramatically.
    You might want to check the quality of the embedded bitmaps for compression artifacts etc.
    cheers
    phil.

  • Compressing files so everyone can open them

I use a MacBook Pro and compress files (right click, compress) for upload to a site called Teachers pay Teachers. Users who download my files often complain that they can't unzip/un-compress and access the files.  Some say they are password protected.  I also own Acrobat Pro XI.  I've been told that files I created with my newer software can't be opened by users with older versions.  I need a foolproof method for compressing files so every person can download the contents.  Any help is appreciated. Thank you.

    Thank you for the reply.  I know I should be able to simply compress in zip format but I have had several bad reviews based solely on the fact that people can't unzip my files compressed and uploaded from my MacBook Pro.  I never had complaints from using my old PC.  I am just frustrated since I really strive for great feedback.  I even created a document to help people http://www.mrsrenz.net/HelpforPurchasers.pdf run their computers.  I believe it is operator error but I am questioning why I only have this trouble with my new software and Mac. 
    I only mentioned Acrobat Pro since it's also a new program and am trying to eliminate the cause. 
Do you know if I can compress files from a Mac using another program that is foolproof, rather than using right click, compress?  I am considering using DropBox and uploading all files from my PC (which is super frustrating).  I don't understand why users would say my files are password protected.  The company (Teachers pay Teachers) confirmed the issue is not on my end, but I want all users to have no trouble using work they've purchased from me.
    I really appreciate your help.
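One thing worth ruling out (an assumption, since the exact error text isn't shown): the Finder's Compress command adds Mac-specific extras (a __MACOSX folder, .DS_Store files, extended attributes) that confuse some older Windows unzippers. From Terminal you can build a plainer zip; folder and file names below are illustrative:

```shell
# Stand-in for a real product folder.
mkdir -p MyProductFolder
echo "worksheet" > MyProductFolder/lesson.txt
touch MyProductFolder/.DS_Store

# Build a plain zip: -X drops extended attributes, and the -x
# patterns keep Finder metadata out of the archive entirely.
zip -r -X MyProduct.zip MyProductFolder -x "*.DS_Store" -x "__MACOSX/*"

# Sanity check: list the entries; no .DS_Store or __MACOSX should appear.
unzip -l MyProduct.zip
```

If the complaints stop with archives built this way, the metadata was the culprit rather than anything password-related.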

  • Compress file and split file

There's this nifty free application on Windows called 7-Zip that can compress a file and split it into several parts. I've been searching for a very long time for something like that for the Mac. The closest I've seen is SimplyRAR, but it was buggy, as it did not allow me to define the part sizes.
I've been wanting to do this because I have a FAT32 drive (simply for Mac/Windows compatibility), and being FAT32, the maximum file size is 4 GB. Thus I wanted to split a larger file so that I can store it on the external drive. Are there any other free alternatives for the Mac?

Split&Concat
http://www.xs4all.nl/~loekjehe/Split&Concat/
Or the Mac OS X command line (Terminal.app):
"split", "cat", and "gzip"
To compress, make a compressed disk image of the
files you want, then split the disk image.
    Kj
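For the FAT32 case specifically, split and cat alone will do it. A sketch with toy sizes (for a real FAT32 volume you would use something like -b 4000m to stay under the 4 GB ceiling):

```shell
# Make a demo file, compress it, and split into FAT32-safe parts.
head -c 500000 /dev/urandom > BigFile
gzip BigFile                          # produces BigFile.gz
split -b 100k BigFile.gz BigFile.gz.part_

# Later, on either platform, stitch the parts back together and check.
cat BigFile.gz.part_* > Rejoined.gz
cmp BigFile.gz Rejoined.gz            # identical byte-for-byte
```

The parts are plain files, so they copy onto the FAT32 drive like anything else; only the rejoined archive ever exceeds the part size.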

  • Silence added to the end of 1 of 2 songs that were "split" and exported as .aif files.

    Hey folks,
I have 2 songs that run seamlessly (ideally) together. When splitting them apart in GarageBand and exporting them separately as .aif files, I find that the first song ends up with about 4 seconds of silence at the end, while the other does not. This obviously affects their ability to run smoothly together in playback. I've tried doing this a couple different times but have only had luck when sending the songs directly to iTunes as a compressed file. Any ideas?
    Thanks

    infantalia11 wrote:
     song ends up with about 4 seconds of silence at the end
    http://www.bulletsandbones.com/GB/GBFAQ.html#exportexactlength
    (Let the page FULLY load. The link to your answer is at the top of your screen)

  • I have a huge file which is in GB and I want to split the video into clip and export each clip individually. Can you please help me how to split and export the videos to computer? It will be of great help!!


    video
    What version of Premiere Elements do you have and on what computer operating system is it running?
    Please review the following workflow.
    ATR Premiere Elements Troubleshooting: PE11: Project Assets Organization for Scene and Highlight Grabs from Collection o…
    But please also determine if your project goal is supported by
    a. format of your source
    and
    b. computer resources
    More later based on details that you will post.
    ATR

File History split a 1 GB file into 5 200 MB files, and I can't find how to restore this file

I set up File History on a Windows 8 computer.
    Backed up all the files to a NAS.
    Wiped and reloaded 8.1 pro and did a file restore.
    All of my files, except one large file, restored.
    The file that didn't was a 1 gig TrueCrypt (VeraCrypt) file in the root of "documents".
    When I browse out to my NAS, I see 5 files.
    One of them is named: TCvol (2014_12_09 02_17_05 UTC) and it is 200 megs.
    I believe these 5 files need to combine themselves again to restore my original 1 gig file, but I cannot find any documentation.
    Pretty much everything shows that you open file history restore and your data should appear there.

    Hi,
This is likely because Windows Server Backup cannot recognize the encrypted file. I cannot confirm how the third-party application (TrueCrypt) works; it seems that it splits an encrypted file into several files, and Windows Server Backup only restored the
files physically. You can contact TrueCrypt support about this issue. 
I searched and found several threads related to getting TrueCrypt working with Windows Server Backup. Specific to your current issue, you can confirm with them whether there is any suggestion for recovering the file, or how to do such a backup job so it
recovers cleanly. 
    https://veracrypt.codeplex.com/discussions/581529
    https://veracrypt.codeplex.com/discussions/577924
    Please Note: Since the web site is not hosted by Microsoft, the link may change without notice. Microsoft does not guarantee the accuracy of this information.
    Please remember to mark the replies as answers if they help and un-mark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • How to split a single file and then route it to different systems

    Hi All,
I have a requirement: my incoming file (a single file) needs to be split into 3 files based on a field, "company code", and then routed to different systems.
    my incoming file is like this .....
    Header,name,.....,.....,........,.....,.....
    Detail,.....,.....,.....,.....,......,companycode(100),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(101),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(100),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(102),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(101),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(102),....,.....,......
    EOF.
I need to split this file, and my output files should be like this:
    For 1st system
    Header,name,.....,.....,........,.....,.....
    Detail,.....,.....,.....,.....,......,companycode(100),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(100),....,.....,......
    EOF.
    For 2nd system
    Header,name,.....,.....,........,.....,.....
    Detail,.....,.....,.....,.....,......,companycode(101),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(101),....,.....,......
    EOF.
    For 3rd system
    Header,name,.....,.....,........,.....,.....
    Detail,.....,.....,.....,.....,......,companycode(102),....,.....,......
    Detail,.....,.....,.....,.....,......,companycode(102),....,.....,......
    EOF.
and then send them to three systems based on company code.
Can anyone tell me how this can be achieved?
    Thanks,
    Hemanth.

    Hi Nallam,
I tried the same thing, but since the input file contains different company codes, it is not splitting the file into three parts and sending only the relevant data to the respective system; instead the whole file is going to all the systems.
I came to know that the file has to be split in the mapping only. We are able to do that with the mapping, but the problem is in routing: in receiver determination we make use of the source structure for giving the condition. Can you please help me with this?
    Thanks,
    Hemanth.
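Outside of PI, the split step itself is easy to prototype, which can help confirm the expected outputs before wiring up routing. A sketch in awk, on a toy version of the file above (the company code's field position is an assumption; adjust $7 to match the real structure):

```shell
# Toy input in the same shape as the question (fields trimmed for brevity).
cat > input.csv <<'EOT'
Header,name,f1,f2
Detail,a,b,c,d,e,companycode(100),x
Detail,a,b,c,d,e,companycode(101),x
Detail,a,b,c,d,e,companycode(100),x
Detail,a,b,c,d,e,companycode(102),x
EOF.
EOT

# One output file per company code, each with the header repeated
# and its own EOF marker. Assumes the code sits in field 7.
awk -F, '
  /^Header/ { hdr = $0; next }
  /^EOF/    { next }
  /^Detail/ {
      code = $7; gsub(/[^0-9]/, "", code)      # companycode(100) -> 100
      out = "system_" code ".csv"
      if (!(out in seen)) { print hdr > out; seen[out] = 1 }
      print >> out
  }
  END { for (f in seen) print "EOF." >> f }
' input.csv
```

This produces system_100.csv, system_101.csv, and system_102.csv, each shaped like the per-system outputs the question describes.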

  • Split an XML File and invoke a Web Serivce for each splits using an  XSLT

    Hi,
We have a requirement to split an incoming XML file into chunks based on a child element and hit the target Web Service for each and every split (i.e. no. of splits = no. of hits on the Web Service).
Currently, the incoming XML file is getting split but appended to the SAME payload, and the Web Service is hit only once with the entire payload.
    Is it possible to invoke a WebService within a XSLT ?
Please find below the XSLT code used to split the file into chunks of 3:
<xsl:param name="pItemsNumber" select="3"/>
<xsl:template match="@*|node()">
  <xsl:choose>
    <xsl:when test="count(Batch/Item) > 4">
      <ItemMaintenance>
        <xsl:for-each-group select="Batch/Item"
                            group-adjacent="(position()-1) idiv $pItemsNumber">
          <xsl:result-document href="ItemsMatch\{current-grouping-key()}.xml">
            <ItemMaintenance>
              <xsl:copy-of select="current-group()"/>
            </ItemMaintenance>
          </xsl:result-document>
        </xsl:for-each-group>
      </ItemMaintenance>
    </xsl:when>
    <xsl:otherwise>
      <xsl:copy>
        <xsl:apply-templates select="@*|node()"/>
      </xsl:copy>
    </xsl:otherwise>
  </xsl:choose>
</xsl:template>

    Hello,
    It is possible to invoke a webservice from XSLT. To achieve this you must create a custom XSLT function that does the actual webservice call. You need to write an implementation in Java for this. See my blogpost http://blog.melvinvdkuijl.nl/2010/02/custom-xslt-functions-in-oracle-soa-suite-11g/ for a simple example.
    Regards,
    Melvin

  • The amount of memory used for data is a lot larger than the saved file size why is this and can I get the memory usage down without splitting up the file?

I end up having to take a lot of high-sample-rate data for relatively long periods of time. When I save the data it is usually over 100 MB. When I load the data for post-processing, though, the amount of memory used is excessively higher than the file size. This causes my computer to crash, because 1.5 GB is not enough. Is there a way to stop this from happening without splitting up the file into smaller files?

LabVIEW can efficiently handle large files, far beyond 100 MB, provided that care is taken in the coding of the loading/processing routines. Here are several suggestions:
    1) Check out the resources National Instruments has put together (NI Developer Zone > Development Library > Measurement and Automation Software > LabVIEW > Development System > Optimizing Applications > Managing Memory), specifically the article entitled "Managing Large Data Sets in LabVIEW".
    2) Load and process the data in chunks if possible.
    3) Avoid sending the data to front panel indicators, using local/global variables for data storage, or changing data types unless absolutely necessary.
    4) If using LabVIEW 7.1, use the "show buffer" tool to determine when LabVIEW is creating extra
    copies of data in memory.

  • Script to move files and compress

    Hi Guys,
I am in need of a script that moves log files that are older than 14 days to a compressed folder. I would like to run this as a scheduled task at the end of each week, with the logs going into the same compressed folder each week. Is this possible? And if so, could anybody please help me? I have spent hours on this and I am really struggling, as I don't have much knowledge of scripting at all.
    Many thanks,
    Jack  

    Ah my Friend we're in Powershell land for that, and very easy it is.
    The compressing part is easy and does not require Powershell as long as the destination File system is NTFS
    Make a folder called Archive (or whatever name suits you) and in Windows Explorer, go into the Properties of that folder, Choose the "Advanced Button" and select "Compress Contents to Save Disk Space" and hit apply.
    Now you should see that folder change color.  Anything going into there is a compressed file.  The great part is you can view it like normal, it just uses far less disk space.   A Zip file might take even less but this is very easy to work with.
    As far as accessing the log files older than 14 days, presuming the Folder is called C:\Logfiles and the filenames end with the extension .LOG
    ---------- Archive old Logfiles - Powershell Script -------------------
$TODAY = Get-Date
Get-ChildItem C:\LOGFILES\*.LOG | Where { $_.LastWriteTime.AddDays(14) -lt $TODAY } | Move-Item -Destination C:\LOGFILES\Archive
    ---------- Archive old Logfiles - Powershell Script -------------------
    All this does is look at the directory of files with the extension .LOG and check the last time the file was modified.  Anything older than 14 days is Moved to the Archive folder.   If the "Compression" attribute is enabled, there's your archive process :)
    And of course if you don't have Powershell on the machine in question, it's a quick update from Windows to bring it in :)
    Download Windows Management Framework - http://support.microsoft.com/default.aspx/kb/968929
    Cheers
    Sean
    The Energized Tech
    Powershell. It's so Easy and it's FREE!
    Dive in and use it now, It'll take no time. :)
    http://www.energizedtech.com
    http://www.itprotoronto.ca

  • Why Does idvd take my 26.2gb imovie file and compress it so small?

    Please help. I've created a 17+ minute long imovie file that is 26.2gb. When I view the file information in idvd it says that it's only 0.6gb.
I've already burned a test copy and it looked way too compressed, like a bad JPG file. From iDVD, I burned 'best quality' to a disc image and toasted it. I know that iDVD decides how to do the compression in some complex way, but is there another method to not have so much compression?
    Thanks...

    Robert,
DV (as normally exported from iMovie to iDVD) runs about 13 GB per hour. You say your content is 17 minutes and 26.2 GB - what CODEC did you use?
DVDs are in MPEG-2 compressed format. In Best Quality mode iDVD will take 120 minutes of DV content and compress it to about 4 GB. So 17 minutes of content could be about 0.6 GB.
    F Shippey
    Thanks for your reply. I didn't pick a codec. I just clicked create idvd from imovie and it did it for me. I've made at least 10 other dvds and haven't seen this much loss of quality before. I'm just not sure why...
0.6 was the project size, and I guess your math is right, but I'm very disappointed with the quality.
    Thanks,
    Bob
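The 120-minutes-to-4-GB figure in the reply scales as expected; a one-liner to check:

```shell
# 4 GB per 120 minutes of DV, scaled to a 17-minute project.
awk 'BEGIN { printf "%.2f GB\n", 17 / 120 * 4 }'
# prints 0.57 GB
```

So a 0.6 GB project size for 17 minutes is exactly what Best Quality mode predicts, regardless of how large the source DV file was.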

  • I compressed Activity Monitor and deleted the compressed file in error. How do I get it back?

In trying to reduce my CPU load, I compressed Activity Monitor. Before I could use the compressed file, I mistakenly deleted it, and now when I try to open it, I get the dreaded spinning wheel. After a long wait, I force quit.
    What happened and how do I get Activity Monitor back?

    Never tamper with programs which come with the system. You could try downloading and applying the 'combo updater':
    http://support.apple.com/kb/DL1399
    However this won't help if Activity Monitor hasn't been updated since the initial release, which may well be the case. In that event you will need to reinstall Snow Leopard from your original install disks and then apply the combo updater again.
    Alternatively it may be possible to extract Activity Monitor from the installed disk by using Pacifist - no promises about that.

  • Compress files and folders in OS 10.6

    How do I compress files and folders in OS 10.6?

    If you have Compressor, there are many options to compress files.
If you don't, there are some QuickTime H.264 options available as well.
But since you mention folders, I'm guessing that you're not referring to AV files specifically but to zipping generically. If so, right click and choose the Compress option.
