Technology

Yet Another New Image Format

An anonymous reader sent us a link to a PC World story that talks about a new wavelet-based image compression algorithm that (you guessed it) produces smaller images at higher quality. As usual, I'll believe it when IE and NS integrate support for it and webmasters use it. I bet we're still using gifs and jpegs years from now. What a crock.

  • It may be better to remain silent and seem a fool than to post and remove all doubt, but I gotta know. What, exactly, is an ROC curve?
  • Wavelet based techniques are much better - fewer artifacts (no blockiness and other crap), smaller.
    But - most of the algorithms are proprietary ;(

    BTW. The example in the post above - with a faint line - is exactly where DWT based techniques shine - they preserve singularity type structure in the data. Look up publications on wavelet denoising.

    Good start page [mathsoft.com]
  • I may be using the term incorrectly, but I take it to mean having different levels of transparency. The GIF format only has simple transparency: either you can see through the GIF completely or not at all for a given pixel. The PNG format has 256 different levels, ranging from fully transparent to fully opaque. The intermediate levels would be nice for blending an image with the background. For example, the background of a web page would show partially through a shadow in another image. Most web design books show examples of a graphic with a drop shadow designed for one color that looks awful on a background of another color. Alpha transparency would reduce that problem dramatically. Unfortunately, I believe that all browsers which currently support PNG graphics convert the alpha values to simple transparency, and rarely very well.
  • Your statement about wavelet compression's effect on ROC curves is not supported by my observations. Can you provide a reference?

    Thanks,

    I.P.
  • there have been plugins for NS and IE to read wavelet images for years! a good friend of mine at university even wrote, back in 1995, a simple compression/decompression program in C for sound using wavelets... wavelets can be used to compress sound very well too.
    --
  • BeOS/netpositive uses PNG transparencies.

    The OS supports the alpha channel so implementing it in your software isn't so hard.
  • Random-like imagery is what it won't work on. There are many types of images where the pixel-to-pixel statistics decorrelate so rapidly that the images are essentially random-like (that is, they have high entropy with respect to the feature of interest). Compressing these types of images with methods like wavelets causes major losses in the ability to retain the faint features that drive the detection statistics. For the technically minded, the ROC (Receiver Operating Characteristic) curve is damaged. That is, the probability of correctly identifying (or even observing) features declines.

    So, prepare to be disappointed.
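    For anyone wondering what an ROC curve actually measures (see the question above), here is a rough numpy sketch with a made-up faint-line image standing in for real data: the curve plots probability of detection against probability of false alarm as you sweep the detection threshold.

    import numpy as np

    rng = np.random.default_rng(0)

    # Noise background with a faint horizontal line through the middle.
    img = rng.normal(0.0, 1.0, size=(128, 128))
    truth = np.zeros((128, 128), dtype=bool)
    truth[64, :] = True
    img[truth] += 0.5            # the "faint feature": half a sigma above the noise

    # Treat each pixel's value as a detection score and sweep a threshold.
    scores, labels = img.ravel(), truth.ravel()
    thresholds = np.linspace(scores.min(), scores.max(), 200)
    tpr = [(scores[labels] > t).mean() for t in thresholds]    # probability of detection
    fpr = [(scores[~labels] > t).mean() for t in thresholds]   # probability of false alarm

    # A good detector's curve hugs the top-left corner; compression that smears
    # the faint line pushes the whole curve toward the fpr == tpr diagonal.
    for f, t in list(zip(fpr, tpr))[::40]:
        print(f"false alarm {f:.3f}  ->  detection {t:.3f}")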
  • Suppose you have an RGB colored object which is partially transparent (e.g., a balloon or colored
    glass) and you want to compress an image of the object and still have its partial transparency
    represented in the compressed image. You need another component, thus RGBA (A is alpha).

    Alpha is also used to make the background show through non-rectangular objects and to anti-alias
    object edges.

    The reason it is called alpha is that it turns out it's useful to model transparency as a blend
    factor X*(A) + Y*(1-A) so that if you put multiple transparent things overlapping it looks like
    real life.

    This is in contrast to a transparency bit to indicate if the pixel is transparent or opaque.
    Often this is called 1-bit alpha.

    You can simulate alpha transparency by dithering the 1-bit alpha, but it just isn't the same.
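    Here is a quick numpy illustration of that blend factor (all the pixel values are made up), showing why 8-bit alpha beats a 1-bit transparency flag:

    import numpy as np

    # Foreground: a red square with a soft (anti-aliased) edge in its alpha channel.
    fg_rgb = np.zeros((4, 4, 3)); fg_rgb[..., 0] = 1.0            # pure red
    alpha  = np.array([[0.0, 0.3, 0.3, 0.0],
                       [0.3, 1.0, 1.0, 0.3],
                       [0.3, 1.0, 1.0, 0.3],
                       [0.0, 0.3, 0.3, 0.0]])[..., None]

    # Background: flat blue.
    bg_rgb = np.zeros((4, 4, 3)); bg_rgb[..., 2] = 1.0

    # The blend: X*(A) + Y*(1-A), per pixel, per channel.
    out = fg_rgb * alpha + bg_rgb * (1.0 - alpha)

    # With 1-bit alpha you have to round alpha to 0 or 1 first, and the soft
    # edge pixels turn into a hard red/blue fringe instead of a blend.
    out_1bit = fg_rgb * alpha.round() + bg_rgb * (1.0 - alpha.round())
    print(out[0, 1], "vs", out_1bit[0, 1])    # blended purple vs pure blue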

  • Unfortunately, you forgot to include the URL for the Analog Devices site, specifically related to the adv6XX chips.

    Here 'tis:

    http://www.analog.com/techsupt/software/lcm.zip
  • If you want to experiment with a completely GPL'd image compression algorithm, you can download my GWIC (GNU Wavelet Image Codec) [jole.fi] image compression algorithm. It is certainly alpha quality and the compression performance is not the highest state of the art [ucla.edu] (but neither is WI :) ). In fact the compression performance should be somewhat comparable to WI.

    I have not done any development on GWIC lately, because of the lack of interest, but I already have an almost-ready-to-use new version of it, if someone is interested in integrating it into some other program. GIMP, anyone?

    It is quite easy to add progressivity (in fact the format already supports it, but I have not implemented a progressive decompressor), regional focusing (in fact I already implemented that some years ago in another compression system) or alpha channels. I have also implemented distortion limiting for the compression (not yet integrated into GWIC), which allows the user to specify the exact quality of the image, not only the quantization or target image size.

  • OmniWeb3 (a NEXTSTEP, MacOS X Server and Yellow Box/NT browser) supports it. I just looked at the PNG test page and we apparently fail the "Images with non-square pixels and/or pixels with physical dimensions" test. Guess I'll go fix that :)

    http://www.omnigroup.com/Software

    (and yes, I work for them -- heck, I own them :)

  • Also PNG is not a lossy scheme and the authors said they will never put a lossy scheme into it. This is good, because we want to see ".png" and know it is lossless, and see ".jpeg" (or ".wl") and know it is lossy. Do-everything things like tiff are doomed because we want to know what is going on...
  • by Axe ( 11122 )
    ...they are closer to fractals - zooming property is linked to self-similarity. That's what makes them good for edge detection and compression.
  • Why?

    Can you tell us why the *HELL* it's useless if it's not open?!
  • I'd like to know what kind of knucklehead tries
    to compress an image file of random static. I
    don't know of many algorithms that would handle
    that well... ;)
  • Is the standard open?

    From what I saw on the web page, it didn't seem like I could find out how the compression worked or how to write my own viewer, but maybe I'm wrong.
  • Thank you for the lesson. The argument is so clear that I feel a little disappointed in myself for not being able to think of it on my own. I guess that's the way it is with a lot of things.
  • > A .tar.gz of BMPs would probably be smaller than a lot of single GIFs

    Two potential shortcomings with this. One, over how many bytes does gzip check for duplicate patterns? I remember ol' pkzip for DOS, unless otherwise specified, only checked 64k chunks in the name of compression speed.

    Second, tar.gz is made to compress binary files losslessly. If you can develop a lossy compression algorithm, one that is right-brain recognizable, then you do not have to worry about being as accurate with the data as gzip would be. This is why gzipped BMPs don't compare.
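    If you want to see the lossless-vs-lossy gap for yourself, here is a rough sketch (assuming Pillow and numpy are installed; the "photo" is just a synthetic gradient plus noise):

    import io, zlib
    import numpy as np
    from PIL import Image

    rng = np.random.default_rng(1)

    # A stand-in for a photo: smooth gradient plus a little sensor-style noise.
    x = np.linspace(0, 255, 256)
    img = (x[None, :] + x[:, None]) / 2 + rng.normal(0, 8, (256, 256))
    img = img.clip(0, 255).astype(np.uint8)

    raw = img.tobytes()            # uncompressed pixels, like a BMP payload
    gz = zlib.compress(raw, 9)     # lossless, gzip-style DEFLATE

    buf = io.BytesIO()
    Image.fromarray(img).save(buf, format="JPEG", quality=75)   # lossy transform coding

    print("raw    :", len(raw), "bytes")
    print("deflate:", len(gz), "bytes (lossless)")
    print("jpeg   :", len(buf.getvalue()), "bytes (lossy)")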
  • i think it's funny that the people screaming that wavelets are a proprietary non-open source(tm) piece of shit are the same people creaming in their pants over mp3's, another non-open source(tm) proprietary format. before someone calls me a loony, doesn't the mp3 format have a few patented aspects (i.e., proprietary)? didn't the patent holders sue/beat/disembowel the people who released a gpl mp3 player? if so, then no open-source version exists. funny, downloading a gpl'd program from a warez site.

    i've used wavelets. i've played with the "voodoo dejavu" technology out of at&t. they are cool. as for this "you need a plug-in for them", yes, for now. also, the various formats are incompatible. i remember jpeg going through the same thing. cshow and gws each couldn't read some jpeg files that the other could. then the standards committee put down its foot and we finally had one format. wavelets will be the same way. maybe jpeg will incorporate wavelets in the v2000 spec?

    also, wavelets are much better at video than mpeg/motion jpeg/avi files. this is one of the things we're dinking with right now. you can rewind them, whereas you cannot rewind the *peg's! avi can possibly be rewound, but the file is huge (and proprietary). wavelets are smaller than all the others.

    is our dinking around using a proprietary format? yes. once some committee standardizes on something, we'll change our format to that. mpeg-4/5 would be cool if it had this.
  • Posted by Mr. Assembly:

    Stop asking if it's open - it's probably patented. My guess is that it could lead to the hijacking of a format like what happened to GIF a few years back.
    People probably would not want to hassle with another plugin anyway.
  • Most, if not all, of the Amiga web browsers have supported PNG for quite some time now. It's about time people switched.

    However, let's see what WI is like - maybe we'll be slating PNG next year...
  • From compression 101: This is always true by what is known as the counting argument.

    Suppose we have a 2:1 compressor that takes 1K images down to 512byte images. There are a
    total of 256^1024 possible images, and we only have 256^512 representable (assuming our compressor
    is perfectly efficient). That leaves about 256^1024 - 256^512 ~ 256^1024 possible images not representable
    which is almost the same amount you started with.

    This means you pretty much have all the possible images to form your interesting class of images that kill your compression algorithm.

    Compression is easy; you just have to know which images you don't care about (which unfortunately is
    the hard part).
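    For the skeptical, the arithmetic is easy to check exactly with Python's big integers (same counting argument, nothing more):

    # A perfect 2:1 compressor maps 1K images onto 512-byte outputs.
    total         = 256 ** 1024     # every possible 1K image
    representable = 256 ** 512      # every possible 512-byte output

    # Fraction of 1K images that cannot survive a lossless round trip:
    print((total - representable) / total)    # prints 1.0, to float precision

    # Why: the two counts differ by well over a thousand orders of magnitude.
    print(len(str(total)), "digits vs", len(str(representable)), "digits")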
  • This looks very cool, especially for images and video. What would it take to use the same mathematical principles and have an OSS codec?

    Any OSS codec would kill a proprietary standard, since it would soon be ported to more platforms than just Netscape / IE under Wintel.

    Codecs are commodities, and the sooner the world realises this the better.

    Ice Tiger
  • So the neat fact that the image is smaller is basically crushed by the time/space/bandwidth it takes to install proprietary extensions to decode said "small" image. Foo.
  • a lot of the usernames are the same. same writing style for some of the ACs. hell, mr. taco rips on wavelets and drools over mp3s.
  • Just one small problem. Compression requires a Beowulf cluster running overnight ...
  • by Anonymous Coward
    Well, It all depends on the filter pair used to do the wavelet decomposition. Wavelet compression's ultimate goal is to remove coefficients that represent details. If a filter pair is chosen that captures some details in the low-pass portion (approximation portion), the result will look very reasonable for the output size, however, the filters that are less mathematically intensive (faster and easier to implement) generally destroy detail and look like crap.
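    As a concrete example of the filter-pair idea, here is a one-level Haar split on a toy signal (plain numpy; real codecs use longer filters such as Daubechies or the 9/7 pair, but the averages/details structure is the same):

    import numpy as np

    signal = np.array([10., 10., 10., 10., 50., 52., 10., 10.])  # mostly flat, one sharp bump

    pairs  = signal.reshape(-1, 2)
    approx = pairs.mean(axis=1)               # low-pass: local averages
    detail = (pairs[:, 0] - pairs[:, 1]) / 2  # high-pass: local differences
    print("approx:", approx)                  # [10. 10. 51. 10.]
    print("detail:", detail)                  # [ 0.  0. -1.  0.]

    # Crude "compression": throw away small detail coefficients, then invert.
    detail[np.abs(detail) < 2] = 0.0
    rebuilt = np.empty_like(signal)
    rebuilt[0::2] = approx + detail
    rebuilt[1::2] = approx - detail
    print("rebuilt:", rebuilt)                # the 50/52 bump comes back as a flat 51/51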
  • Here's an idea that I haven't heard of yet to help compress still images.

    What if images came in packs, and after being compressed individually (using an algorithm I haven't had any experience with yet :) they were compressed against each other as well, to find shared patterns? This would be perfect for webpages which show more than one image at a time anyway.

    I understand mpeg does something like this, but that adds a fourth dimension (time; there is no third dimension here :) )
  • FAST Image Transfer, it's made by the same guys that gave us FAST FTP search and FAST MP3 search!

    Take a look: http://web.fast.no/product/imagetransfer/det.asp?id=44
  • Greater compression on images only addresses bandwidth issues and completely neglects latency.

    We already have JPEG, and if anything replaces GIF it will be PNG (the sooner the better!). PNG handles "solid" colors well, like GIF, but it's an open standard like JPEG, and supports 8 and 16-bit alpha... rather than the 1-bit "on"/"off" alpha found in GIF89a.

    HTTP servers typically start a transfer off relatively slowly, allocating more bandwidth as the transfer progresses. So unless your image files are really large, the server never reaches its potential because it finishes the transfer before ramping up. IOW, servers do better with one 100k file than with ten 10k files.

    If the images could be bundled into a container format, like how some Java applets use .JAR, then pages would appear to download more smoothly, especially ones with lots of small images.

    Another big waste is "localized" websites that are not at all local. It would help everyone if sites like "Yahoo Boston" were ACTUALLY located *IN* Boston so you weren't dragging a page across the country. It's wasteful. Maintaining a remote webserver is very easy to do if you use UNIX (Yahoo uses BSD..).

    Of course, Microsoft's Sidewalk sites can't be remotely maintained because they run on NT [network farms...]. LOL... I wonder what MS' IT budget must be, aside from the fact that they are exempt from hundred thousand dollar NT server licenses... :-D

    Nothing will do more to help the web than local caching.
  • He specifically said "either have incredible loss or a negative compression ratio" (that is, the file gets bigger), which is the case with LZW on purely random data (the best you can hope for is to almost break even).

    ---
  • LZW (which I think is the heart of the UNIX compress command -- pls don't flame me if I'm wrong) would give lossless compression, if it compressed. But, the "compressed" file is as large as the input file in this case.

    All this is an informal way of saying that the degrees of freedom needed by the compression algorithm are equal to the number of pixels in the image. Hence, no compression.
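    This is easy to check at home with a couple of lines of Python (zlib is DEFLATE, i.e. LZ77 plus Huffman rather than LZW, but the moral carries over):

    import os, zlib

    noise = os.urandom(64 * 1024)              # 64 KB of incompressible bytes
    packed = zlib.compress(noise, 9)

    print(len(noise), "->", len(packed))       # the "compressed" file comes out slightly bigger
    print("ratio:", len(packed) / len(noise))  # just over 1.0: a negative compression ratio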
  • Sorry, I couldn't resist.

    We've occasionally used the making of foie gras as an example for this topic. You stuff a lot in and hope that the little bit that comes out is edible.

    This is not a case of lossless compression, however. The result is vastly different from the input. But you're sure willing to live with it!
  • No.

    Summus owns a stack of patents on wavelet codecs, and they're royal bastards about "their IP". In fact, this is one of the most patented areas of mathematics. Even though you can't patent mathematical formulae under US law...

    Wavelets, especially adaptive wavelet coding, are vastly superior to the DCT (the basis for JPEG and MPEG), especially if you do video codecs using 3D wavelet transforms. Why haven't they taken over? Because it's a patent minefield.

    I saw this stuff three years ago - it looks nice, and it really does work, but it's not remotely open. The JPEG2000 standard will likely be no more open and free than GIF or MP3, with their associated patents.
  • The viewer can be found on Summus's site:

    To download:
    http://www.summus.com/products/download/download.html

    Click on ActiveX or Netscape to take a look at their image gallery.

    Amoeba [remotepoint.com]
  • I do know about these Summus people though.
  • marketing@summus.com

    I think that's enough to change my mind. I don't want to give them an excuse to spam me every time they release an update to their ActiveX control. Ugh.

  • oops, bye-bye features.
  • I guess I can buy a multifractal view of wavelets.

    But, as I've noted in my comments elsewhere, there are many interesting classes of images where one might use (ugh) fractals as some sort of descriptor, but wavelet based compression doesn't work.
  • by Anonymous Coward
    Wavelet compression has been around for quite a while. It is also used in digital image processing. I know that doesn't answer the question. But if WI is going to be a proprietary format, it wouldn't take much to create an open source format that uses Wavelet.

    As I understand it, Wavelet is the same as FFT except the basis function is different.
  • by Axe ( 11122 )
    An out-of-the-box compression will drop it.
    A properly selected one will not, and will be more efficient than Fourier, MA or other denoising techniques. It is also very efficient for automating your analysis. Think machine vision.

    I use wavelet transforms to search for certain features in time series data. Works excellently.
    Unlike a windowed FT, they preserve important singularities.

    WT is a broad subject, and it seems to me you picked one particular implementation that is not up to your goal. E-mail me if you want to discuss this problem; I am always interested in new applications :)
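    In the same spirit, here is a minimal wavelet-denoising sketch for time series (it assumes the third-party PyWavelets package, and the threshold choice is ad hoc, not anyone's production setup):

    import numpy as np
    import pywt   # third-party: PyWavelets

    rng = np.random.default_rng(2)

    # Noisy series with a sharp step -- the kind of singularity a windowed FT smears out.
    t = np.arange(512)
    clean = np.where(t < 300, 0.0, 4.0)
    noisy = clean + rng.normal(0, 1.0, t.size)

    # Decompose, soft-threshold the detail coefficients, reconstruct.
    coeffs = pywt.wavedec(noisy, "db4", level=5)
    thresh = 3.0   # ad hoc; a principled choice would be e.g. sigma * sqrt(2 * ln N)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")

    print("noise std before:", np.std(noisy - clean).round(2))
    print("noise std after :", np.std(denoised[: clean.size] - clean).round(2))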
  • I have been studying wavelet compression for a number of years now.

    It has been well proven that well-coded wavelet transforms can be made more accurate than the DCT (Discrete Cosine Transform) that is at the heart of both the JPEG and MPEG formats.

    The problem is, most of these formats are still at the academic stage of coding, have been licensed into extinction, and in general, don't offer the rest of the world much of a reason to come on board.

    After spending a few minutes perusing the Summus Website and their technology, only one item struck me as being worthwhile -- the idea of focus regions, i.e., areas where less compression can be used to maintain higher fidelity to the original image.

    Two things I would like to see:

    1. this regioning technique incorporated into PNG, JPEG, etc.
    2. a fast, Open Source wavelet transform which all of us Linux coders can put through the grinder until it is as worthwhile as JPEG and GIF.

    I am willing to put up the server space for an OSS wavelet project if enough /.'ers are interested.

    Feel free to send an inquiry, but make sure you indicate both your coding and wavelet experience in the body of the e-mail; items without both will simply be trashed.
  • I used to run one of those silly free Web graphics sites. I tried to put PNGs on it instead of GIF but all the sillies didn't like it because they didn't know what a PNG was. I also tried to release my graphics under the GPL (I know, it's for software, but what the hey) and I got flame mail from those copyright-nut artists. Argh, I guess I can't win.
  • Wavelets are hardly new - the idea has been around for at least 3 years. I don't recall where I first heard about them (Scientific American?) but AT&T had a large interest a couple of years ago. In fact, a quick search for "wavelet" on altavista turned up wavelet.org, which is sponsored by Lucent Technologies. Other pages turned up included papers written about wavelets, and wavelet related source code. As I write this, I am downloading a package called wavelet.0.3.tar.gz - it is essentially a wavelet construction kit (grayscale only). The file is dated 1/29/97. It's about 700k, and there is no copyright notice / license on the page [dartmouth.edu] I'm downloading it from, nor in the source code.

    So what we have is an old image format which hasn't caught on yet. I find it hard to believe that a community like the one here at Slashdot has never heard of this before, much less played with the freely available source code.
  • PNG is a lossless compression format. It's similar to GIF, but patentless, and it can handle more than 256 colours (among other things). This makes it good for non-natural images, especially those that suffer at the hands of lossy compression formats like JPEG. However, with lossless compression you're seldom gonna get the same kind of compression ratio, simply because (rather simply put) lossy compression algorithms throw away a lot of the data, so they don't actually have so much to compress.

    So, I suggest that in the future, before you start insulting the /. readership in general, you use your brain and make sure you actually do know a little about what you're talking about.

    cheers,

    Tim
  • > Have you tried to surf todays web using
    > Netscape or IE 2? Half the web sites do not
    > work, and a lot of the ones that do look
    > terrible. People will upgrade their browsers if
    > the web becomes unworkable with their current
    > one.

    That's just the thing, they don't. *You* know that the web isn't supposed to look like that, but they don't... Not to mention, there are plenty of people like my mom who have a 386 just for word processing and a little browsing to check the weather... try putting Netscape 4.x or the IE4 beastie on it. Watch it crawl.

    > How many people do you see still using Mosaic
    > because they are affraid Netscape is buggy?

    Do you run a web site that gets a decent amount of hits? Check your logs, you'd be amazed.

    :)
  • There is still plenty of room at the bottom, guys. Since the JPG format cuts an image up into 8x8 pixel blocks and compresses these individually, it should be possible to compress one block a bit more than another. This could be used to implement regional focus, possibly resulting in significant space gains on many pictures while staying strictly JPG. I haven't experimented with this possibility (yet), so I don't know how it will turn out in real life.

    Anyone care to make a nice GIMP plug-in for this?
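    Here is a rough proof-of-concept of that per-block idea in plain numpy (naive 2D DCT via a matrix multiply, made-up quantization steps; a real implementation would hook into libjpeg's quantization tables instead):

    import numpy as np

    N = 8
    # Orthonormal DCT-II basis matrix: dct2(b) = C @ b @ C.T, idct2(B) = C.T @ B @ C.
    k, n = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)

    rng = np.random.default_rng(3)
    img = rng.normal(128, 30, (64, 64))        # stand-in image
    focus = np.zeros((64, 64), dtype=bool)
    focus[16:48, 16:48] = True                 # the region we care about

    out = np.empty_like(img)
    for y in range(0, 64, N):
        for x in range(0, 64, N):
            block = img[y:y+N, x:x+N]
            q = 4.0 if focus[y, x] else 32.0   # fine step inside the focus region, coarse outside
            coeffs = np.round((C @ block @ C.T) / q) * q
            out[y:y+N, x:x+N] = C.T @ coeffs @ C

    err = np.abs(out - img)
    print("mean error inside focus region :", err[focus].mean().round(2))
    print("mean error outside focus region:", err[~focus].mean().round(2))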

  • by chaotic ( 8538 )
    No compression algorithm will work on all images. And, for any given method, there are many types of important images that can't be compressed.
  • The problem with these wavelet formats is that they are proprietary. They can't ever hope to make any real money off that. Who in their right mind would use a WI image on their web site?

    Nuff said.
    --
  • Hey: that perfectly describes the JPEG software I have. Mind you, not _all_ JPEG will do that - I'm talking Boxtop Software 'ProJPEG'. It takes noticeably longer than generic JPEG to run, because it's doing things like (if I remember correctly) optimizing the Huffman coding of whatever the hell it's doing... *boggle* At any rate, the 'curve' of this is striking. It's not that much greater than, say, Photoshop's version, for super high quality that's supposed to look lossless. Still looks like a jpeg, maybe a bit smaller. However, when you start getting ruthless, look out! I think my limit was nearer to 400:1. That looks ugly, and it's more usable to go with between 30:1 and 90:1...

    Examples? Easy: my art pages (which also include a bunch of linux tiles and titlebars in XPM) have background pictures that are JPEG. They are all 1024x768 and are around 100K in size. They are here [airwindows.com]. (And I should give my mac/linux dualboot box up and start doing everything in Windows for what, exactly, mister proprietary compression vendor sir? Feh)
  • wavelet compression has many advantages over jpg:

    no 8x8 tiles, progressive loading (with only a few percent of the whole data, astonishingly good picture quality is possible)

    wavelets are very well suited to compressing natural photographs (I have seen some demos)
  • We probably will be using jpeg and gif for quite a while. I remember the first time I heard about png, it was supposed to be this revolutionary format that would be used on the web and everywhere. And now where is png? Well, it's gnome's preferred gfx format, if that counts for anything :)

    MoNsTeR
  • IE4 does not support png in all installations - it isn't implemented on IE 4.5 for Macs. I know, whoop de doo - but at least attempt to be correct when making a statement like that.


    Re:
    http://www.hotwired.com/webmonkey/99/09/index0a_page3.html?tw=graphics_fonts [hotwired.com]

    - Jeff
  • doesn't everyone here remember hearing about this from either slashdot or freshmeat...
    this is at least 6 months old... i'm thinking it's more like one year though..
  • Maybe a few hundred pieces of email asking the same question will pound them into submission. :-) If anybody can get through to their web site [summus.com], post an address.
  • Come on, who with any amount of brains is gonna use a proprietary format, let alone a proprietary IMAGE format? How well is this thing going to last, and how well will it ever be supported? It's common knowledge that proprietary formats always die. There are of course some exceptions, like the M$ Word format, but that's only used a.) because Word has a monopoly on word processors, and b.) because even intelligent people only use it (forgetting the fact that an intelligent person would use LaTeX :) inside their own editor, not when they want large numbers of people to see it. No large company has a monopoly on image formats; Free Software has a stake in it, many smaller companies do, and so do large companies like Adobe, Macromedia (I really hate proprietary software, but that company comes out with some really awesome software), M$, and Netscape.
  • How do I use Netscape's native support? On my WinNT (ritual spit towards Redmond) box I installed Apple's QuickTime 3 plugin, which takes over the PNG format (and a boatload of others as well). I'm running Netscape 4.5. How do I tell it to disable the QT plugin for PNGs and render them natively? This is somewhat important to me, as I'm doing some web development with PNGs and I want to see how Netscape handles them. My other recourse is to get another machine in here with a "clean" install.

    I looked at the "Applications" entry in the Preferences dialog. "PNG" is listed as being supported by a plugin, but the "Edit" and "Removed" buttons are greyed out. I can't find a way to change it.

    An oddity in the Linux (hail Linus!) version of Netscape 4.5: I have no plugins at all installed, and the "Applications" entry in Preferences tells me "PNG file -- Unknown: Prompt User". Netscape does render PNGs, though, even though it claims not to. But it seems that PNG is not its preferred format. On the PNG home page [cdrom.com] the PNG logo is given in an OBJECT tag, along with a GIF. Netscape renders the GIF instead of the PNG.
  • This is an interesting metric. Hmm, is there a reference site that you recommend for comparing
    different lossy compression schemes using ROC?
  • OK, you buy a sense of humor at K-mart, now ...
  • yup, it's all in the model. And, that's the hard part. Wavelets, et al., are too generalized for many classes of interesting images. They may work well on some classes, as you note.
  • It lets the innovative slip in under the "radar" of the entrenched constituencies. By the time the "old boys" figure it out, they're history (or we at least have enough of their market to make us happy).

    Besides, a friend of mine, one of those euphemistic "high government officials" (in the sense of high grade, not in the Bill Clinton sense of not inhaling Monica) once noted that "if you have no enemies, you haven't done your job."

    So, let's have more NIH from all the back room boys! Meanwhile, we'll be demonstrating working systems to your customers.
  • Posted by Charles Bronson:

    According to Summus' page, the Photoshop Plugin alone costs $150. I see this format going... nowhere. Surprise.

    See for yourself at: http://summus.com/products/4u2c/photoshop/4u2c_photoshop.html

    "Summus' 4U2C(TM) Adobe PhotoShop Plug-in extends Adobe PhotoShop's file formats to include 4U2C(TM) Image Compression. This Plug-in allows PhotoShop users to view Summus Wavelet Images and convert other image formats to Summus Wavelet Images. Summus Wavelet Images file size is controlled by File Size, Compression Ratio or Image Quality.

    Price (US and Canada only)

    $149.00 + S&H ( SC residents add 5% sales tax )
    For International Pricing call or email"
  • Netscape supports PNG, and I believe that IE does as well. I've seen web pages with nary a GIF in sight, all images having been converted to PNG.
  • What I'd like to know about this format:

    1. Lossy or lossless? (I'm guessing lossy, but would like to know for sure.)
    2. How many colors?
    3. Does it support alpha transparency?
    4. Open or Proprietary? (The big one)

    Of course, I'm still waiting on proper support for PNG [hotwired.com] graphics.
  • Wavelet compression is not entirely unrelated to fractal compression. Like the fractal approach the artefacts are more in sympathy with the way the brain works than are the artefacts of 2D Fourier based approaches, like JPEG. It is not, therefore, entirely incorrect to say that files are both smaller and better. You can make the wavelet file smaller than a JPEG file before a mathematical measurement says the loss is similar. However, when you look at the results the wavelet picture still looks considerably better, as the artefacts are more in sympathy with the way our eyes work.

    Wavelet and fractal compressors have been around for at least a decade, for audio and other data as well as for images. They have previously been highly asymmetric in their compute requirements - they need huge compute power for compression, but very little for decompression. This has previously made them impractical for applications like mini-DV cam-corders - the key reason MPEG-2 is still 2-D Fourier based. Tricks for getting the compute requirements down are now coming through. The next MPEG standard may change to wavelets, and I understand the next JPEG standard definitely will.
  • Discrete Wavelet Transforms are chosen as the transform for JPEG 2000, so we will get the best of both worlds.
    /juels
  • You don't give a rat's rectum now, but you'll be bitching like hell when your pointy-haired-Dilbert-boss comes to you and says, "We've opened a branch office in Podunk, and they need to do e-commerce. Put that catalog you've been working on online."

    You *should* give a flying fluck about web graphics, because it all seems to be converging. It's no longer the technology that leaves boys typing with one hand-- it's the tech that my mom just ordered a car with.



  • Okay, I'll use all PNGs on my next web project. Then I'll go cash in my unemployment the next week. I'd love to use PNG, but it's kinda hard to go to a client who's hired my company for a project and explain: "Well, no, your audience will probably be limited to the techie 5% of the Internet population, but it's the Right Thing to Do."

    It's just not that simple. I really wish it were. If it were, I'd have nary a speck of WinNT running in my local world.
  • Make an image of random numbers. Put a faint line in it.

    Now, compress that image with JPEG, LZW, or wavelets.

    Upon decompression -- the line is either gone or you get a negative compression ratio.

    Sound far-fetched? The image described above is a good proxy for things like radar images of the ocean.
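    This is easy to try (assuming Pillow and numpy; the numbers are arbitrary), and you can watch both the compression ratio and the line's contrast as you turn the quality knob:

    import io
    import numpy as np
    from PIL import Image

    rng = np.random.default_rng(4)

    img = rng.integers(0, 256, (256, 256), dtype=np.uint8)    # pure static
    img[128, :] = np.clip(img[128, :].astype(int) + 12, 0, 255).astype(np.uint8)  # faint line

    buf = io.BytesIO()
    Image.fromarray(img).save(buf, format="JPEG", quality=50)
    back = np.asarray(Image.open(io.BytesIO(buf.getvalue())))

    print("compressed size:", len(buf.getvalue()), "of", img.nbytes, "raw bytes")
    contrast = back[128].astype(int).mean() - back[[127, 129]].astype(int).mean()
    print("line contrast after JPEG:", round(contrast, 1), "(roughly +12 before)")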
  • Actually, PNG is the internal image format used by MS Office. It is not widely advertised, but
    it is acknowledged. libpng (the reference implementation) is available in source. The only issue
    I have with it is that it relies heavily on FP hardware. A truly portable implementation should
    work on integer-only hardware.
  • i did a little bit of web research and found that there is stuff about wavelets dating back to 1990...
    i also found www.wavelet.org :)
    in their second newsletter they give c++ source for the compression codec...
  • I create 2D and 3D images and animations for a living. PNG really saves space on my computer by allowing me gif-type compression at 24-bit color. This format has been used in the 3D world for several years now... I expect it to replace GIF sometime in the near future.
  • Naturally, this will depend severely on things like the size of the images.

    I tried this experiment with 2400+ small bmps we have here (average size, 1k or so). The tar.gz was about 1/20th the size of the directory of GIFs. This won't always be the case, I'd guess. Since the images were so small, the GIF directory probably had a higher proportion of redundant info (ie, 2400+ headers) than a normal set of images.
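    If anyone wants to rerun that experiment on their own directory of images, it's only a few lines of Python (the paths here are placeholders):

    import os, tarfile

    src = "images/"                   # placeholder: a directory full of GIFs/BMPs
    files = [os.path.join(src, f) for f in os.listdir(src)
             if os.path.isfile(os.path.join(src, f))]

    loose = sum(os.path.getsize(f) for f in files)

    with tarfile.open("images.tar.gz", "w:gz") as tar:
        for f in files:
            tar.add(f)

    packed = os.path.getsize("images.tar.gz")
    print(f"{loose} bytes loose -> {packed} bytes as tar.gz ({loose / packed:.1f}:1)")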
  • I'm sitting on a 256K adsl connection and I'll say this much... It's Nice for downloads.
    I have no problems with webpages loading, as even the largest pages load within 10 seconds or so (a rarity to take that long)
    PNG support works just fine under Netscape 4.8 for Linux, though I must admit I don't run into many pages that use it.

    Jpeg is great.. it works for the web... No.. I don't like using a 56K connection anymore.. but hey.. like you said... connections are getting bigger all the time.

    If you need HIGH quality compressed images... chances are you will be using a proprietary format anyway. *shrug*

    And as far as Wave files go... With the technology coming soon to a desktop near you... wouldn't you rather see realtime recording of MP3?
    Why waste space when you don't have to?

    One question... do you REALLY want to lose 100Gigs of data in ONE hd failure? hehehehe.


    just ramblin
  • $149 for the PhotoShop plugin. Not worth it, especially since the web browser needs either a plugin or an ActiveX component.

    I'll stick to JPG & PNG. Corel PhotoPaint supports them natively, and they can be viewed on Linux, Windows, Apple, etc. I refuse to use an image format on my website that would make it readable only by Windows users.
  • if your mom ordered a car only having seen a picture you should both be shot

    Nope, not quite. You look at the pictures, then order someone to come by for a test drive. What, you think those auto sites just UPS it to you?

  • by Anonymous Coward
    It might surprise everyone to know that wavelet compression may be coming to a browser near you sooner than you think. The next version of JPEG (JPEG 2000) will support wavelet compression, so presumably this will also be incorporated into browsers.

    Regarding the quality and use of wavelet compression: this type of compression is nothing new. Wavelet compression has been used for graphics and audio for some time now. While different wavelet algorithms perform with varying degrees of quality, a good algorithm will provide much higher quality and better compression than JPEG. Why? Read some technical papers on it - I'm not about to explain it here. I personally have seen it used not just for web browsers, but for field applications such as compressing medical images (CT/MR scans, etc.). Compression ratios of 100:1 are sometimes achievable with little loss. So, no, it is not a waste of time or a fad.

    I'll admit, it will be several years perhaps before JPEG 2000 is fully supported by all browsers, but if JPEG continues to be the standard, we will have wavelet compression widely used in a variety of applications.

    Those of you who are naysayers, do a bit of research on JPEG 2000 and you may change your mind. Otherwise, don't complain.
  • Most problems with ROC can be fixed with a compander or in the general case, histogram
    renormalization.

    Although I personally think wavelets are a lost cause in the generic image compression arena,
    people have used them successfully in specific areas where the data has known characteristics
    which are supported by models (e.g., fingerprints, synthetic aperture sub-millimeter radar).
    Wavelets also have many uses in analysis.

    The DCT and most wavelet transforms are perfectly able to represent any image, since they are
    non-singular transforms. The compression artifacts are not because of the basis
    functions, but because of the quantization of the coefficients. Most quantization algorithms are
    naive, so they produce naive results.

    That's why I don't think wavelets have a chance. Wavelet people are so wound up in producing better
    wavelet functions that they end up ignoring quantization and companding improvements, whereas
    the DCT guys stopped playing with the basis functions a long time ago and are years ahead in
    the quantization, companding, and entropy coding areas.

    BTW: It is amazing how people dis algorithms using the NIH (not invented here) metric.
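    One way to see the "it's the quantizer, not the basis" point in a few lines of numpy (toy numbers, an orthonormal Haar step on an 8-sample signal):

    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.normal(0, 1, 8)

    # One orthonormal Haar stage: scaled averages and differences.
    H = np.zeros((8, 8))
    for i in range(4):
        H[i, 2*i] = H[i, 2*i+1] = 1 / np.sqrt(2)                      # low-pass rows
        H[4+i, 2*i], H[4+i, 2*i+1] = 1 / np.sqrt(2), -1 / np.sqrt(2)  # high-pass rows

    coeffs = H @ x
    print("basis-only error:", np.abs(H.T @ coeffs - x).max())   # ~1e-16: the transform loses nothing

    # All of the loss comes from quantizing the coefficients:
    for step in (0.1, 1.0):
        q = np.round(coeffs / step) * step
        print(f"quantizer step {step}: reconstruction error {np.abs(H.T @ q - x).max():.3f}")

    # Companding buys resolution where coefficients actually live: with 32 levels over
    # [-peak, peak], mu-law (mu = 255) makes the step near zero roughly 46x finer than
    # a plain uniform quantizer's, at the cost of coarser steps for the big coefficients.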
  • The only thing that's going to change what gets used is people actually changing the formats. If you don't want to see more GIFs, use PNGs for all your pages. Most browsers support them (well, IE 4, Netscape 4 and Mozilla, plus any browser that uses external image software.)
