Preferably, start by finding an open-source C# compression component.
For example: ICSharpCode.SharpZipLib. Download URL: http://www.icsharpcode.net/opensource/sharpziplib/default.aspx
Following its documentation, you can do most of what you need.
I am using this component and ran into a problem.
Compressing small files works without error, but once the source file reaches 150 MB, it can bring your machine down (at least mine).
Why? Because compressing a 150 MB source file this way requires allocating a 150 MB byte array in memory. A well-equipped machine may still cope, but an average machine will struggle badly, and if the file is even larger, even a good machine can't keep up.
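For illustration, this is roughly what the naive whole-file approach looks like (a sketch, assuming SharpZipLib's `ZipOutputStream`/`ZipEntry` API; `path` is a hypothetical file path variable):

```csharp
using System;
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

// Naive approach: the entire source file is loaded into memory at once.
// For a 150 MB source file this allocates a 150 MB byte[] in one shot,
// which is exactly what overwhelms an average machine.
byte[] all = File.ReadAllBytes(path);
ZipOutputStream zipOut = new ZipOutputStream(File.Create(path + ".zip"));
ZipEntry entry = new ZipEntry(Path.GetFileName(path));
zipOut.PutNextEntry(entry);
zipOut.Write(all, 0, all.Length);  // one huge write from one huge buffer
zipOut.Finish();
zipOut.Close();
```

The segmented method below avoids this by never holding more than one segment of the file in memory at a time.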
To solve the problem of compressing large files, you can compress the file in segments.
private string CreateZipFile(string path, int m)
{
    try
    {
        Crc32 crc = new Crc32();
        ICSharpCode.SharpZipLib.Zip.ZipOutputStream zipOut =
            new ICSharpCode.SharpZipLib.Zip.ZipOutputStream(System.IO.File.Create(path + ".zip"));
        System.IO.FileStream fs = System.IO.File.OpenRead(path);
        long pai = 1024L * 1024 * m;        // write m MB per pass
        long forint = fs.Length / pai + 1;  // number of passes
        byte[] buffer = null;
        ZipEntry entry = new ZipEntry(System.IO.Path.GetFileName(path));
        entry.Size = fs.Length;
        entry.DateTime = DateTime.Now;
        zipOut.PutNextEntry(entry);
        for (long i = 1; i <= forint; i++)
        {
            if (pai * i <= fs.Length)
            {
                // A full m MB segment remains.
                buffer = new byte[pai];
            }
            else
            {
                if (fs.Length < pai)
                    buffer = new byte[fs.Length];                 // file smaller than one segment
                else
                    buffer = new byte[fs.Length - pai * (i - 1)]; // final partial segment
            }
            fs.Seek(pai * (i - 1), System.IO.SeekOrigin.Begin);
            fs.Read(buffer, 0, buffer.Length);
            crc.Reset();
            crc.Update(buffer);
            zipOut.Write(buffer, 0, buffer.Length);
            zipOut.Flush();
        }
        fs.Close();
        zipOut.Finish();
        zipOut.Close();
        System.IO.File.Delete(path);  // note: deletes the original file after compression
        return path + ".zip";
    }
    catch (Exception ex)
    {
        string str = ex.Message;
        return path;
    }
}
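A minimal usage sketch (assuming the method above is in scope and SharpZipLib is referenced; the file path and segment size here are hypothetical):

```csharp
using System;

// Compress a large file in 10 MB segments; only one 10 MB buffer
// lives in memory at a time, regardless of the file's total size.
string source = @"C:\temp\bigfile.dat";   // hypothetical path
string result = CreateZipFile(source, 10);
// On success, result is the new archive path (source + ".zip");
// on failure, the method returns the original path unchanged.
Console.WriteLine(result);
```

Keep in mind that the method deletes the source file on success, so test it on a copy first.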