Thread: Papers 2011

  1. #1
    Member BetaTester's Avatar
    Join Date
    Dec 2010
    Thanked 3 Times in 3 Posts

    Post Papers 2011

    Adoption of Lossy image for the purpose of clinical interpretation

    Scalar Quantization for Relative Error

    Collaboration in Distributed Hypothesis Testing with Quantized Prior Probabilities

    Reliable information embedding for image/video in the presence of lossy compression

    Floating-Point Data Compression at 75 Gb/s on a GPU
    This paper investigates whether GPUs are powerful enough to make real-time data compression and decompression possible in such environments, that is, whether they can operate at the 32- or 40-Gb/s throughput of emerging network cards. The fastest parallel CPU-based floating-point data compression algorithm operates below 20 Gb/s on eight Xeon cores, which is significantly slower than the network speed and thus insufficient for compression to be practical in high-end networks. As a remedy, we have created the highly parallel GFC compression algorithm for double-precision floating-point data. This algorithm is specifically designed for GPUs. It compresses at a minimum of 75 Gb/s, decompresses at 90 Gb/s and above, and can therefore improve internode communication throughput on current and upcoming networks by fully saturating the interconnection links with compressed data.
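    The abstract does not describe GFC's bitstream, but the general idea behind fast floating-point compressors of this kind (delta-code the 64-bit patterns of consecutive doubles, then drop the leading zero bytes of each residual) can be sketched in a few lines. This is an illustrative toy codec, not the actual GFC format:

```python
import struct

def compress(values):
    """Delta + leading-zero-byte coding for doubles (a generic sketch,
    not the actual GFC bitstream)."""
    out = bytearray()
    prev = 0
    for v in values:
        bits = struct.unpack("<Q", struct.pack("<d", v))[0]
        delta = (bits - prev) & 0xFFFFFFFFFFFFFFFF
        prev = bits
        raw = delta.to_bytes(8, "big")
        # count leading zero bytes; store the count, then the residual bytes
        nz = 8 - len(raw.lstrip(b"\x00"))
        out.append(nz)
        out += raw[nz:]
    return bytes(out)

def decompress(blob):
    vals = []
    prev = 0
    i = 0
    while i < len(blob):
        nz = blob[i]; i += 1
        raw = b"\x00" * nz + blob[i:i + 8 - nz]; i += 8 - nz
        prev = (prev + int.from_bytes(raw, "big")) & 0xFFFFFFFFFFFFFFFF
        vals.append(struct.unpack("<d", struct.pack("<Q", prev))[0])
    return vals
```

    On smooth data the deltas between consecutive bit patterns are small, so most residuals shrink to a few bytes; the scheme is lossless either way.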

    An unsupervised learning quantiser design for image compression in the wavelet domain using statistical modelling

    Image compression using adaptive lifting scheme based on minimum mean square error criterion

    A new lossless chain code compression scheme based on substitution

    Comparative Study of Arithmetic and Huffman Data Compression Techniques for Koblitz Curve Cryptography

    Speckle Noise Reduction of Medical Ultrasound Images using Bayesshrink Wavelet Threshold

    SAR Image Compression using SPIHT Algorithm

    Image based Secret Communication using Double Compression

    An Efficient Text Compression for Massive Volume of Data
    This paper proposes a new text compression technique for ASCII text, aimed at good performance across a range of document sizes. The algorithm has two stages: in the first stage, the input strings are converted with a dictionary-based compressor; in the second stage, the redundancy of the dictionary-coded output is reduced by the Burrows-Wheeler transform and run-length coding. The algorithm achieves a good compression ratio and a reduced bit rate, with faster processing of the text.
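    The dictionary stage is not specified in the abstract, but the second stage is standard machinery. A minimal sketch of the Burrows-Wheeler transform (naive sorted-rotations version, with an assumed '$' end marker) and run-length coding:

```python
def bwt(s):
    """Burrows-Wheeler transform via sorted rotations ('$' marks the end)."""
    s = s + "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def rle(s):
    """Run-length code: each maximal run becomes a (count, symbol) pair."""
    out = []
    i = 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append((j - i, s[i]))
        i = j
    return out
```

    The BWT groups equal symbols together, which is exactly what makes the subsequent run-length stage effective.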

    An Enhanced Vector Quantization Method for Image Compression with Modified Fuzzy Possibilistic C-Means using Repulsion

    Implementation of an Improved Watershed Algorithm in a Virtex 5 Platform

    A Novel Approach for Reduction of Huffman Cost Table in Image Compression
    This paper proposes a new method for reducing the cost table used in Huffman-based image compression. Compared with traditional Huffman coding, the proposed method yields better cost-table generation: with the new binary Huffman table, both the space requirement and the time required to transmit the image are reduced significantly.

    Improved Adaptive Block Truncation Coding for Image Compression

    Superior SOM Neural Network based Minute Significant Watermark Generator and Detector System

    Novel K-means Algorithm for Compressing Images

    An Efficient Hybrid Image Compression Scheme based on Correlation of Pixels for Storage and Transmission of Images

    Analysis of Don't Care Bit Filling Techniques for Optimization of Compression and Scan Power
    Test power and test time are major issues in current VLSI testing, and test data compression is a well-known method for reducing test time. Don't-care bit filling can be used both for effective test data compression and for reducing scan power. This paper describes don't-care assignment algorithms such as the MT (Minimum Transition)-fill technique and a Hamming-distance-based technique. Selective Huffman, optimal Huffman, and modified selective Huffman coding are then applied to the mapped set to obtain optimum compression, while a weighted transition matrix is used to evaluate scan power. Using these techniques, the authors measure compression and scan-power parameters (average power and peak power) and conclude that MT-fill gives low peak and average power, while Hamming-distance-based modified selective Huffman coding gives a higher compression ratio than the other methods, such as selective and optimal Huffman coding.
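    MT-fill itself is simple to state: every don't-care position copies the nearest specified bit to its left, so no new transitions are introduced inside the scan vector. A small illustrative sketch (the handling of leading X's is an assumed convention, not taken from the paper):

```python
def mt_fill(cube):
    """MT (Minimum Transition) fill: each don't-care ('X') copies the
    nearest specified bit to its left. Leading X's copy the first
    specified bit (an assumed convention)."""
    bits = list(cube)
    first = next((b for b in bits if b != "X"), "0")
    last = first
    for i, b in enumerate(bits):
        if b == "X":
            bits[i] = last
        else:
            last = b
    return "".join(bits)

def transitions(s):
    """Count adjacent-bit transitions (a simple proxy for scan power)."""
    return sum(a != b for a, b in zip(s, s[1:]))
```

    Because a filled X always equals its left neighbour, MT-fill never adds a transition beyond those forced by the specified bits.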

    H.264 based Selective Video Encryption for Mobile Applications

    Color Image Compression using SPIHT Algorithm
    In this paper, the R, G, and B components of a color image are converted to YCbCr before the wavelet transform is applied. Y is the luminance component; Cb and Cr are the chrominance components. The Lena color image is used for analysis, and the image is compressed at different bits per pixel by varying the level of wavelet decomposition.
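    The paper does not say which RGB-to-YCbCr variant it uses; the full-range BT.601 (JPEG-style) conversion is the usual choice and reads as follows:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr (the JPEG convention; the paper
    does not state which variant it uses)."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return y, cb, cr
```

    Separating luminance from chrominance matters for compression because the eye is less sensitive to chrominance detail, so Cb and Cr can be coded more coarsely than Y.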

    Performance Analysis of InterpolatedShrink Method in Image De-Noising

    Motion Adaptive Compensation Approach for Deinterlacing of Video Sequences

    A Review of Region-of-Interest Coding Techniques of JPEG2000
    JPEG2000 provides many different ROI coding mechanisms, namely the general scaling method, the max-shift method, the bitplane-by-bitplane shift method (BbBShift), the partial significant bit-plane shift method (PSBShift), and the ROITCOP (ROI coding through component priority) method. These methods have advantages and disadvantages relative to one another, and the choice of method for ROI coding is very much dependent on the requirements of the application at hand.
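    Of these, max-shift is the easiest to illustrate: the encoder scales ROI coefficients up by s bits, with 2**s larger than any background magnitude, so the decoder can separate ROI from background by magnitude alone, without ever receiving the ROI mask. A toy integer-coefficient sketch (a simplification of the JPEG2000 mechanism, not the standard's exact procedure):

```python
def maxshift_encode(coeffs, roi):
    """Max-shift sketch: ROI samples are scaled up by s bits, where
    2**s exceeds every background magnitude."""
    s = max((abs(c) for c, m in zip(coeffs, roi) if not m), default=0).bit_length()
    return s, [c << s if m else c for c, m in zip(coeffs, roi)]

def maxshift_decode(s, coeffs):
    """Any coefficient with magnitude >= 2**s must belong to the ROI."""
    return [c >> s if abs(c) >= (1 << s) else c for c in coeffs]
```

    The price of not transmitting the mask is a larger dynamic range: the shifted ROI coefficients need s extra bit-planes, which is exactly the trade-off the other methods listed above try to soften.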

    Lossless Text Compression using Dictionaries
    The algorithm suggested here uses the dynamic dictionary created at run-time and is also suitable for searching the phrases from the compressed file.

    Still Image Compression by Combining EZW Encoding with Huffman Encoder

    A High Throughput Algorithm for Data Encryption

    Automated Multiple Related Documents Summarization via Jaccard's Coefficient

    Frame permutation quantization

    Usability of irreversible image compression in radiological imaging. A position paper by the European Society of Radiology (ESR)
    Last edited by BetaTester; 4th June 2011 at 05:19.

  2. #2
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Kharkov, Ukraine
    Thanked 1,369 Times in 783 Posts
    Thanks, especially for reminding about DCC

  3. #3
    Member bello's Avatar
    Join Date
    Dec 2014
    Thanked 0 Times in 0 Posts
    Thanks a lot for this list of downloads...
