Re: [Imgcif-l] High speed image compression
- To: The Crystallographic Binary File and its imgCIF application to image data <imgcif-l@iucr.org>
- Subject: Re: [Imgcif-l] High speed image compression
- From: Justin Anderson <justin@rayonix.com>
- Date: Thu, 28 Jul 2011 17:36:02 -0500
- In-Reply-To: <CAMkkSyn+uC4VxZpaqAhQb=ENzJYEgj+N5CCs+bPt2-JS+S_otQ@mail.gmail.com>
- Organization: Rayonix, LLC
- References: <4E31AE8C.8040405@rayonix.com><CAMkkSyn+uC4VxZpaqAhQb=ENzJYEgj+N5CCs+bPt2-JS+S_otQ@mail.gmail.com>
Thanks Nicholas. I only made a couple of small changes to Graeme's code: 1) to load an image from a file and write the result to a file, and 2) to pass the data vectors by reference. The last change seems to have sped things up a little, but it is still taking 110 - 130 ms to compress, which is too slow. We are not as concerned with decompression speed, as that will not need to occur in real time.

I put it on our FTP here: ftp://ftp.rayonix.com/pub/del_in_30_days/byte_offset.tgz.

Thanks,
Justin

On 7/28/11 2:06 PM, Nicholas Sauter wrote:
> Justin,
>
> Just some comments based on our experience... first, I haven't tried the
> compression extensively, just the decompression. But I've found Graeme's
> decompression code to be significantly faster than the CBF library, first
> because it is buffer-based instead of file-based, and also because it
> hard-codes some assumptions about data depth.
>
> I'd be happy to examine this in more detail if there is some way to share
> your code example...
>
> Nick
>
> On Thu, Jul 28, 2011 at 11:46 AM, Justin Anderson <justin@rayonix.com> wrote:
>
>> Hello all,
>>
>> I have run Graeme's byte offset code on a 4k x 4k (2 byte depth) Gaussian
>> noise image and found it to compress the image in around 150 ms (64-bit
>> RHEL, Pentium D 3.46GHz). Using the CBF library with byte offset
>> compression, I find the compression takes around 125 ms.
>>
>> This will be too slow to keep up with our high speed CCD cameras. We are
>> considering parallelizing the byte offset routine by operating on each
>> line of the image individually. Note that this would mean that a given
>> compressed image would be stored differently than via the whole-image
>> algorithm.
>>
>> Has anyone been thinking about this already, or does anyone have any
>> thoughts?
>>
>> Regards,
>>
>> Justin
>>
>> --
>> Justin Anderson
>> Software Engineer
>> Rayonix, LLC
>> justin@rayonix.com
>> 1880 Oak Ave. #120
>> Evanston, IL, USA 60201
>> PH: +1.847.869.1548
>> FX: +1.847.869.1587
_______________________________________________
imgcif-l mailing list
imgcif-l@iucr.org
http://scripts.iucr.org/mailman/listinfo/imgcif-l
- Follow-Ups:
- Re: [Imgcif-l] High speed image compression (Jonathan WRIGHT)
- Re: [Imgcif-l] High speed image compression (Nicholas Sauter)
- Re: [Imgcif-l] High speed image compression (Herbert J. Bernstein)
- References:
- [Imgcif-l] High speed image compression (Justin Anderson)
- Re: [Imgcif-l] High speed image compression (Nicholas Sauter)