Foreign-Literature Translation: Extracting Color Features and Dynamic Matching for Image Data-Base Retrieval (PDF + Word, about 9300 Chinese characters, Chinese and English texts)
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, VOL. 9, NO. 3, APRIL 1999

Extracting Color Features and Dynamic Matching for Image Data-Base Retrieval

Soo-Chang Pei, Senior Member, IEEE, and Ching-Min Cheng

Abstract—Color-based indexing is an important tool in image data-base retrieval. Compared with other features of the image, color features are less sensitive to noise and background complication. Based on the human visual system's perception of color information, this paper presents a dependent scalar quantization approach to extract the characteristic colors of an image as color features. The characteristic colors are suitably arranged in order to obtain a sequence of feature vectors. Using this sequence of feature vectors, a dynamic matching method is then employed to match the query image with data-base images for a nonstationary identification environment. The empirical results show that the characteristic colors are reliable color features for image data-base retrieval. In addition, the proposed matching method has acceptable accuracy of image retrieval compared with existing methods.

Index Terms—Color extraction, color index, dynamic matching, image data-base retrieval.

I. INTRODUCTION

THE fast evolution of workstations and network technology has made visual communication service more and more popular. One application of this service is allowing network users to view materials of remote image data bases (e.g., a trademark data base, a multimedia data base, etc.) on their workstations. To make this application realistic, effective indexing and retrieval of image data bases is essential. In [1] and [2], the texture keyword is described as a means for image indexing and retrieval. This approach searches image data bases on the keyword records, and the associated images are retrieved on the completeness of the texture search. Some approaches [3] using computer vision techniques are also considered. In computer vision, the identification of objects is always the major task.
Traditionally, the identification is based on the geometrical features of the object, e.g., the gray spatial moments, the area, the extremal points, and the orientation. This means of identification suffers from variations in the geometrical features in the scene, most importantly: distractions in the background of the object, viewing the object from a variety of viewing points, occlusion, varying image resolution, and varying lighting conditions.

Manuscript received January 2, 1997; revised October 1, 1998. This work was supported by the National Science Council, R.O.C., under Contract NSC88-2213-E-002-047. This paper was recommended by Associate Editor S. Panchanathan. S.-C. Pei is with the Department of Electrical Engineering, National Taiwan University, Taipei, Taiwan, R.O.C. C.-M. Cheng is with the Applied Research Laboratory, Telecommunication Laboratories, Ministry of Communications, Taipei, Taiwan, R.O.C. Publisher Item Identifier S 1051-8215(99)03051-7.

On the contrary, color, unlike image irradiance, can be used to compute image statistics independent of geometrical variations in the scene [4]. With the help of color-based features, Swain and Ballard have presented a color index method [5], called the histogram intersection method. In their method, they use the color histograms to create three-dimensional (3-D) histogram bins for each query image. The 3-D bins are then matched with those of data-base images by using the histogram match value. Although this approach is useful in many situations, performance degrades significantly in changing illumination. To alleviate this problem, Funt and Finlayson [18] have extended Swain's method to develop an algorithm called color constant color indexing, which recognizes objects by matching histograms of color ratios.
Reference [18] reported that Funt's algorithm does slightly worse than Swain's under fixed illumination, but substantially better than Swain's under varying illumination. Recently, Slater and Healey [19] have derived local color invariants, which capture information about the distribution of spectral reflectance, to recognize three-dimensional objects with good accuracy. Slater's algorithm is independent of object configuration and scene illumination, but the computational cost of processing an image is high.

On the other hand, Mehtre et al. [6] have developed the distance method and the reference color table method to do color matching for image retrieval. The distance method uses the mean value of the one-dimensional (1-D) histograms of each of the three color components as the feature. Then a measure of distance between two feature vectors, in which one is from a query image and the other is from a data-base image, is used to compute the similarity of the two feature vectors. In the reference color table method, a set of reference colors is first defined as a color table. Each color pixel in the color image is assigned to the nearest color in the table. The histogram of pixels with the newly assigned colors is then computed as the color feature of the image. For image retrieval, the color feature of the query image is matched against all images in the data base by a similarity measure to obtain the correct match.

Different from the above-mentioned color-based methods, which use the color histogram as a feature, we propose in this paper the usage of the representative colors of an image as the features for identification. This approach is similar to what the human visual system does to capture the color information of an image. When viewing the displayed image from a distance, our eyes integrate to average out some fine detail within the small areas and perceive the representative colors of the image.
The representative colors, called characteristic colors in this paper, are extracted by a dependent scalar quantization (DSQ) algorithm and ordered to obtain a sequence of feature vectors. Matching the characteristic colors between a query image and each data-base image is achieved by using a dynamic matching method, which is based on the dynamic programming equation, in order to overcome varying viewpoints of the image, occlusion, and varying lighting conditions and to obtain the best match path. A dynamic programming approach has shown great potential in solving difficult problems in optical character recognition (OCR) [20], and especially speech recognition [21]. Combining the DSQ algorithm and the dynamic matching method, a mechanism is proposed for image data-base retrieval. We demonstrate the proposed mechanism with a set of experiments, which compare the performance of existing methods with that of the proposed mechanism.

This paper is organized as follows. The approach to extracting the characteristic colors as the color features is presented in Section II. Section III describes how to use the proposed color features to carry out the tasks of image data-base retrieval. Section IV reports the simulation results of the proposed matching method. Conclusions are drawn in Section V.

1051-8215/99$10.00 © 1999 IEEE

Fig. 1. The partition of color space by the DSQ.

II. APPROACH TO COLOR FEATURE EXTRACTION

In the early perception stages of human beings, similar colors are grouped together to fulfill the tasks of identification. Based on this assumption, we present a scheme in this section to group similar colors of an image and extract the characteristic colors as the features for identification. The proposed scheme for extracting the color features is similar to the design of a color palette for an image. It requires quantization of the original full-color image into the limited color image.
Among the many approaches to the design of the color palette [7], [8], the DSQ approach proposed by Pei and Cheng [9] has been shown to perform well. We employ the DSQ approach to generate a color palette that contains the characteristic colors used as the color features. This approach for the design of the color palette is reviewed below.

A. Dependent Scalar Quantization

As displayed in Fig. 1, the DSQ approach partitions the color space of an image in a dependent way in order to fully utilize the correlations of the color components of an image. The partition used is the 1-D moment-preserving (MP) technique. The advantage of the MP quantizer is that the output can be written in closed form [10]. When using another fidelity criterion, such as mean square error (MSE) [11], it is usually impossible to obtain a closed-form expression for the quantizer. Also, using this DSQ procedure, the computational cost is rather small.

The DSQ mainly consists of two stages, namely, bit allocation and recursive binary MP thresholding. Suppose the set of all grid points of the image is denoted by S, and its members may be explicitly written as (m, n), where m is the row index and n is the column index. The color value of (m, n) is denoted as (f_1(m, n), f_2(m, n), f_3(m, n)), where f_i(m, n) is the i-th color component value.

1) Bit Allocation: If the DSQ expects to be unsupervised, the number of thresholding levels in each color component must be decided a priori. Since the number of thresholding levels can be expressed by powers of two, we choose b bits to represent the total thresholding levels. The purpose of the bit allocation is to automatically assign a given quota of bits to each color component of the image; that is, a fixed number of bits b has to be divided among the color components i = 1, 2, 3. The bit-allocation method typically assigns more bits to the color components with larger variance and fewer bits to the ones with smaller variance.
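This variance-driven allocation can be sketched in code. The following is an illustrative implementation, under the assumption that the closed-form high-rate allocation rule of (1) is used (bits proportional to the log of each variance relative to the geometric mean), with integer rounding repaired so the quota is met exactly; the function name is ours:

```python
import math

def allocate_bits(variances, total_bits):
    """Split total_bits among components, giving more bits to
    higher-variance components (high-rate allocation rule)."""
    n = len(variances)
    geo_mean = math.prod(variances) ** (1.0 / n)
    # Real-valued optimal allocation: b_i = b/n + 0.5*log2(var_i / geo_mean)
    raw = [total_bits / n + 0.5 * math.log2(v / geo_mean) for v in variances]
    bits = [max(0, round(r)) for r in raw]
    # Repair rounding so the integer bits still sum to total_bits
    while sum(bits) > total_bits:
        bits[bits.index(max(bits))] -= 1
    while sum(bits) < total_bits:
        bits[bits.index(min(bits))] += 1
    return bits
```

For instance, with b = 6 bits and component variances (900, 100, 100), the dominant first component receives the extra bits; with equal variances the quota is split evenly.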
The reason for this choice is that if the data are very spread out on a particular color component, then presumably the data in that component are more significant than those of other, more densely grouped components. The significant component should be given more bits in order to reduce the quantization error. If the probability density functions are the same for all the components except for their variance, an approximate solution to the bit allocation b_i of each component i is given by [12]

    b_i = b/3 + (1/2) log_2 [ σ_i² / (σ_1² σ_2² σ_3²)^(1/3) ]    (1)

where σ_i² is the variance of component i. After the bit allocation, the number of thresholding levels 2^{b_i} of each component i, for i = 1, 2, 3, has been determined. Then the recursive binary MP thresholding is used to partition each component.

Fig. 2. Recursive binary moment-preserving thresholding.

2) Recursive Binary MP Thresholding: Employing the binary MP thresholding on a color component, the total pixels are thresholded into two pixel classes, i.e., the below-threshold pixels and the above-threshold ones. The problem of binary MP thresholding, as Fig. 2(a) illustrates, is to select a threshold value t such that if the component values of all below-threshold pixels are replaced by z_0, and those of all above-threshold pixels are set to z_1, then the first three moments of the color component for the image are preserved in the resulting two-level image. Given the color value of each pixel, the k-th moment m_k of a color component f is defined as

    m_k = (1/N) Σ_{(m,n)} [f(m, n)]^k    (2a)

where N is the total pixel number. The k-th moment can also be computed from the distribution of f values, or histogram, as follows:

    m_k = (1/N) Σ_{z=0}^{L-1} n_z z^k    (2b)

where L is the total number of distribution levels in the histogram and n_z is the number of pixels with value z in the histogram. Let p_0 and p_1 denote the fractions of the below-threshold and above-threshold pixels in the image after the binary MP thresholding.
Then the resultant k-th moments of the two-level image are just

    m'_k = p_0 z_0^k + p_1 z_1^k.    (3)

If we preserve the first three moments of the resultant two-level image equal to those of the original image, i.e., m'_k = m_k for k = 1, 2, 3 (with p_0 + p_1 = 1), we get the moment-preserving equations. After some simple computations, p_0 can be obtained as

    p_0 = (z_1 − m_1) / (z_1 − z_0)    (4)

where z_0 and z_1 are the two roots of z² + c_1 z + c_0 = 0, with c_0 = (m_1 m_3 − m_2²)/c_d, c_1 = (m_1 m_2 − m_3)/c_d, and c_d = m_2 − m_1². From p_0, the desired threshold t is chosen as the p_0-tile of the histogram such that

    (1/N) Σ_{z ≤ t} n_z ≥ p_0.    (5)

Now considering partitioning a color component into 2^{b_i} intervals, we use the binary MP thresholding technique recursively to obtain the desired thresholding levels instead of directly applying the multilevel thresholding technique as in [10]. The range of the color component, with boundary values zero and z_max, is first partitioned into two intervals [0, t_1] and (t_1, z_max], where t_1 represents the threshold obtained by the first binary MP thresholding. Then the interval [0, t_1] is partitioned into two subintervals [0, t_2] and (t_2, t_1], where t_2 is the threshold obtained by the second binary MP thresholding. Similarly, (t_1, z_max] is partitioned into two subintervals (t_1, t_3] and (t_3, z_max] by the third binary MP thresholding. The above procedures are illustrated in Fig. 2. If we continue the procedure recursively by using the binary MP thresholding 2^{b_i} − 1 times and order the resultant thresholds according to their values, the output thresholding levels will be determined.

3) The DSQ Algorithm: Based on the above discussion of bit allocation and recursive binary MP thresholding, the algorithm to generate the DSQ color palette can be described as follows.

Step 1) Input the image and utilize the bit-allocation procedure to determine the total thresholding levels of each color component.
Step 2) Do the recursive binary MP thresholding on the first color component and calculate the associated thresholding levels.
Step 3) Do the following steps 2^{b_1} times, once for each interval obtained in Step 2):
  3.1) Perform the recursive binary MP thresholding on the second component of the pixels within the rectangular box bounded by the first-component thresholds and calculate the associated thresholding levels.
  3.2) Do the following step 2^{b_2} times, once for each subinterval from Step 3.1):

Fig. 3. The ordering of characteristic colors. (a) Layer l. (b) Layer l+1.
Calculate the recursive binary MP thresholding on the third component of the pixels within the long bar limited by the first- and second-component thresholds, and compute the associated thresholding levels.

Step 4) Assign the centroid of the color points inside each cube formed by Steps 2) and 3) as a characteristic color of the color palette.

B. Color Features from the DSQ Approach

After using the DSQ approach, the input colors of the image are grouped into K disjoint sets S_1, ..., S_K, where K is the predetermined color-palette size. In each set S_j, there is a characteristic color c_j chosen. These characteristic colors constitute the color palette. We define C = {c_1, ..., c_K} as the set of the color features that represent the original image.

Fig. 4. The images of 27 objects in the first data base.

The DSQ approach requires K − 1 binary MP thresholdings to finish the color-space partition. Each binary MP thresholding takes approximately N multiplication operations, so the total multiplication count for the recursive binary MP thresholding is roughly proportional to (K − 1)N.

Furthermore, the partition of the color space by the DSQ approach shown in Fig. 1 causes the characteristic colors to be distributed layer by layer in the 3-D space. We arrange these characteristic colors in a 1-D way in order to process them efficiently in the mechanism of image data-base retrieval. As shown in Fig. 3, the characteristic colors are ordered sequentially in the 1-D way, where the index j denotes the associated characteristic color. The value of the index increases from the bottom layer to the upper layer. Through this ordering, two neighboring indexes represent two visually close colors. The image can thus be represented by a sequence of feature vectors

    A = (c_1, c_2, ..., c_K).    (6)

To illustrate the color features extracted by the proposed approach, we have chosen three test images from a data base of 27 images, shown in Fig. 4. The red, green, blue (RGB) images in the data base are of size 159 × 119, coded at eight bits for each color component.
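Since one binary MP split is the workhorse of the DSQ, a small sketch may help. The following is our reconstruction of the closed-form solution behind (2)-(5), operating on a flat list of 8-bit component values; the function name and return convention are ours:

```python
import math
from collections import Counter

def binary_mp_threshold(values):
    """One binary moment-preserving split: find the levels z0, z1, the
    below-threshold fraction p0, and the threshold t such that replacing
    below/above-threshold values by z0/z1 preserves the first three
    moments of the data."""
    n = len(values)
    m1 = sum(values) / n
    m2 = sum(v * v for v in values) / n
    m3 = sum(v ** 3 for v in values) / n
    cd = m2 - m1 * m1                   # > 0 for non-constant data
    c0 = (m1 * m3 - m2 * m2) / cd
    c1 = (m1 * m2 - m3) / cd
    # z0 and z1 are the roots of z^2 + c1*z + c0 = 0
    disc = math.sqrt(c1 * c1 - 4.0 * c0)
    z0 = (-c1 - disc) / 2.0
    z1 = (-c1 + disc) / 2.0
    p0 = (z1 - m1) / (z1 - z0)          # fraction of below-threshold pixels
    # The threshold is the p0-tile of the histogram, as in (5)
    hist = Counter(values)
    cum = 0
    for z in sorted(hist):
        cum += hist[z]
        if cum / n >= p0:
            return z, z0, z1, p0
```

On a perfectly bimodal input (half zeros, half 255's) the recovered levels are 0 and 255 with p0 = 0.5, as expected.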
For each test image, we utilize the DSQ approach, which allocates three bits to the color components, to obtain a color feature set of size eight. Setting b = 3 in (1), we have acquired b_i = 1 for i = 1, 2, 3. Thus, binary MP thresholding is used once for each color component in the sequential partition of the color space shown in Fig. 1. In Fig. 5, we show the test images, the corresponding color feature sets, and the quantized images, which used the characteristic colors in each color feature set to form the associated color palette.

Fig. 5. Results of the DSQ approach to extract the color feature sets: (a) original images, (b) corresponding color feature sets, and (c) corresponding quantized images by using the characteristic colors to form the color palette.

It is noticed from Fig. 5(c) that the color information inside the quantized images is similar to that of the corresponding test images. That is, the characteristic colors in each color feature set of Fig. 5(b) can represent the color information inside the corresponding image. Therefore, the characteristic colors extracted by our DSQ approach are reliable and able to be used as the color features of the image.

III. MECHANISM OF IMAGE DATA-BASE RETRIEVAL

This section describes how to use the sequence of feature vectors, which are the characteristic colors extracted by the DSQ approach, to identify a query image from data-base images. The problem of identification can be described as follows. If Q is the query image and X_j is one of the images in the data base, minimize the distortion D(Q, X_j) between Q and X_j. The goal thus is to find an image X_j* in the data base whose distortion satisfies

    D(Q, X_j*) ≤ D(Q, X_j)  for 1 ≤ j ≤ M    (7)

where M is the number of data-base images. Concerning the distortion D between two images, we designate it as the distance between two sequences of feature vectors. To calculate this distance, a measure of closeness between two feature vectors has to be found.
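In code, the decision rule (7) is a plain minimum-distortion scan over the data base. A minimal sketch follows; the distance measure is passed in as a parameter, since it is only defined in the subsections below, and the function and parameter names are ours:

```python
def retrieve(query_features, database_features, distance):
    """Return (index, distortion) of the data-base entry minimizing
    the distortion D(Q, Xj) of eq. (7)."""
    best_j, best_d = None, float("inf")
    for j, feats in enumerate(database_features):
        d = distance(query_features, feats)
        if d < best_d:
            best_j, best_d = j, d
    return best_j, best_d
```

For example, with scalar "features" and an absolute-difference distance, `retrieve(5, [1, 4, 9], lambda a, b: abs(a - b))` returns index 1 with distortion 1.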
Since each feature vector is a characteristic color, the measure used in this paper is the color distance. Then we introduce a dynamic matching method to obtain the distance between two sequences of feature vectors.

A. Color Distance

A color space describing colors close to human perception is crucial in calculating color difference based on color perceptual similarity. The RGB color space, which is normally used in the frame grabber of color-processing systems, does not carry direct information about the color. For example, people cannot distinguish between colors by their (R, G, B) triplet. Contrary to the RGB color space, the empirically defined hue, value, chroma (HVC) space, which is a variant of the Munsell color space, expresses colors in terms of the triattributes of human color perception. Hafner et al. [16] have found that the HVC color space gives the best performance (using human judgment of the similarity between the query and data-base images) for experiments with a variety of color spaces. Processing color images based on the HVC space, the processing errors can be evaluated according to how human beings perceive the errors. To calculate color difference in the HVC color space, we adopt Godlove's formula [13] instead of a Euclidean distance measure. Godlove's formula defines color difference directly related to the National Bureau of Standards (NBS) unit of color difference, which is used for the assessment of color difference [17].

There are several ways to mathematically transform between the RGB and HVC color spaces. The transformation used in this paper involves the XYZ color space. If a set of RGB color values is given, the transformation of a color value from the RGB color space to the HVC color space is as follows [14].

Step 1) The RGB color value is first transformed into an XYZ color value by a linear matrix.
Step 2) A nonlinear process concerned with a human vision model is performed.
Step 3) The HVC color value is obtained by the associated formula. (8)

For two colors in the HVC color space, expressed as (H_1, V_1, C_1) and (H_2, V_2, C_2), Godlove's formula of color distance is defined by

    ΔE = { 2 C_1 C_2 [1 − cos(2πΔH/100)] + (ΔC)² + (4ΔV)² }^(1/2)    (9)

where ΔH = |H_1 − H_2|, ΔC = |C_1 − C_2|, and ΔV = |V_1 − V_2|.

B. Dynamic Matching Method

If the conditions of identification of the query image are consistent, the images of the same object captured at different instants will not differ much in their histograms of color components. The locations of their associated characteristic colors with the same index will thus be close. In this circumstance, the distance between two sequences of feature vectors can be defined as the sum of the color distances between the two characteristic colors with the same index, that is,

    D(A, B) = Σ_{j=1}^{K} d(a_j, b_j).

However, due to varying viewpoints of the image, occlusion, and varying lighting conditions, the environment for identifying the query image is nonstationary. To find the color distance of two sequences of feature vectors in these situations, we propose a dynamic matching method, depicted in Fig. 6(a), where the input sequences A and B are developed along the horizontal axis and vertical axis, respectively. We try to find a path that represents the best match between the two input sequences. This best match path is shown as the sequence F in Fig. 6(a). After this match path is obtained, we then calculate the color distance of each matched pair of characteristic colors, which are feature vectors from A and B. Last, the distance between the two sequences of feature vectors is defined as the sum of these color distances.

Fig. 6. (a) Dynamic matching method. (b) Dynamic programming equation of matching function.

The sequence F can be considered to represent the output of a matching function from the axis of sequence A onto that of sequence B. In Fig. 6(a), we suppose that A and B are expressed as

    A = (a_1, ..., a_K),  B = (b_1, ..., b_K)

where j is the index of the characteristic color and K is the length of the sequence. a_j and b_j are the characteristic colors with their color values. The matching function is denoted as F = (f(1), ..., f(K)), where f(j) is the index in B matched to index j of A.
When there is no difference between A and B, the matching function coincides with the diagonal line f(j) = j. It deviates further from the diagonal line as the difference grows.

The matching process is normally performed by using a dynamic programming equation [15]. In our case, the equation is given by

    D(i, j) = d(a_i, b_j) + min[ D(i, j−1), D(i−1, j−1), D(i−1, j) ]    (10)

where D(i, j) is the partial sum of distance related to point (i, j) along the best path from point (1, 1) to point (i, j), and d(a_i, b_j) is the color distance between the two matched feature vectors, or characteristic colors. Fig. 6(b) illustrates how the partial sum of distance is obtained. In (10), we employ the color difference from Godlove's formula, which is expressed in (9), to calculate d(a_i, b_j).

Fig. 7. (a) The images of 27 query objects. (b) Query images with blurring, 19-dB, and 9-dB degradations from left to right.

TABLE I. COMPARATIVE PERFORMANCE: QUERY IMAGES FOR DATA BASE I

The summation of color distances by (10) reaches its minimum value when the matching function is obtained so as to optimally adjust the difference between sequences A and B. This minimum distance value is then denoted as the distance between sequences A and B:

    D(A, B) = D(K, K).    (11)

TABLE II. COMPARATIVE PERFORMANCE: QUERY IMAGES OF TABLE I WITH DISTORTION

The computational complexity of the dynamic matching method is mainly due to determining the partial sum of (10) at each point (i, j). In (10), this requires three addition calculations, one color distance calculation by Godlove's formula, and one comparison. The total number of processing points is proportional to K². If K is small, for example, eight in the experiments below, this does not cost too much computational overhead.

TABLE III. COMPARATIVE PERFORMANCE: SCALED QUERY IMAGES FOR DATA BASE I

Fig. 8. Effects of changing intensity on match success of the proposed method, Swain's method, Funt's method, the distance method, and the reference color table method.
IV. EXPERIMENTAL RESULTS

To illustrate the performance of the proposed matching method, we built 27 query images, shown in Fig. 7(a). Compared with the first data base shown in Fig. 4, the objects shown in Fig. 7(a) are translated, rotated, and occluded. For these query images, we have utilized:

1) Swain's histogram intersection method [5];
2) Mehtre's distance method [6];
3) Mehtre's reference color table method [6];
4) the proposed dynamic matching method

to see if the query objects can be distinguished from the data base of Fig. 4 and to evaluate the sensitivity of the above matching methods to noise, changes in resolution, and various lighting conditions.

Fig. 9. The 65 trademark images in the second data base and six query images. (a) 65 trademark images. (b) Six query images: T1-T6.

TABLE IV. COMPARATIVE PERFORMANCE: QUERY IMAGES FOR DATA BASE II

Fig. 10. Sample query results of different matching methods: (a) Swain's method, (b) the proposed method, (c) the distance method, and (d) the reference color table method.

The color space used in the experiments is the RGB space. Concerning the proposed matching method, the length of the sequence of feature vectors chosen is eight. The number of 3-D histogram bins used by Swain's method is 2048. In the reference color table method, the reference color table contains 27 colors, as suggested by [6]. The matching percentile [5] for each query image is calculated as

    matching percentile = (M − rank) / (M − 1)

where M is the total number of objects in the data base and rank is the position of the correct match after the matched results are sorted. Table I shows the matching success of the above testing methods for the query images.
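For concreteness, the matcher evaluated in these experiments can be sketched as a dynamic-programming alignment. The code below is our own rendering of the recursion in (10), using Godlove's formula as reconstructed in (9) (hue on a 100-step circle) as the local color distance; all names are ours:

```python
import math

def godlove(c1, c2):
    """Godlove's color difference for HVC colors (H on a 100-step hue
    circle, V = value, C = chroma); our rendering of eq. (9)."""
    h1, v1, ch1 = c1
    h2, v2, ch2 = c2
    dh = abs(h1 - h2)
    return math.sqrt(2 * ch1 * ch2 * (1 - math.cos(2 * math.pi * dh / 100))
                     + (ch1 - ch2) ** 2 + (4 * (v1 - v2)) ** 2)

def dynamic_match(a, b, dist=godlove):
    """Distance between two feature sequences via the DP recursion
    D(i,j) = d(a_i, b_j) + min(D(i,j-1), D(i-1,j-1), D(i-1,j))."""
    k, l = len(a), len(b)
    big = float("inf")
    D = [[big] * (l + 1) for _ in range(k + 1)]
    D[0][0] = 0.0        # start of the path, before (1, 1)
    for i in range(1, k + 1):
        for j in range(1, l + 1):
            D[i][j] = dist(a[i - 1], b[j - 1]) + min(
                D[i][j - 1], D[i - 1][j - 1], D[i - 1][j])
    return D[k][l]       # total distance along the best match path
```

Two identical sequences yield distance zero along the diagonal path; any mismatch adds a positive Godlove distance to the total.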
As we observe, Swain's and the proposed methods can correctly identify each query image. To see how noise affects the matching accuracy, we added two sets of Gaussian noise to the query images, with the signal-to-noise-ratio (SNR) value defined as

    SNR = 10 log_10 [ (P_1 + P_2 + P_3) / (N_1 + N_2 + N_3) ]

where P_1, P_2, and P_3 are the input color component powers, and N_1, N_2, and N_3 are the noise color component powers. The SNR ranges from 19 to 22 dB for the first set of noise and from 7 to 12 dB for the second set of noise. For each pixel of every query image, we also blurred the query image by averaging pixel values within an 11 × 11 window centered at it, in order to see how this distortion would affect the matching accuracy. Fig. 7(b) displays one query image with the above added degradations. Table II illustrates the matching success of the testing methods for these noise-added and blurred images. As we observe, the proposed method achieves a better average match percentile than both the distance method and the reference color table method. It missed four query objects in the blurred query images. Swain's method does not have good matched results for the query images added with the second set of noise as compared with the proposed method.

In these empirical results, our current implementation of the DSQ algorithm and the dynamic matching method takes less than 2 s of CPU time to process a query image of size 159 × 119 on a Sun SPARCstation 10. Of this total time, about 1 s is needed for the DSQ approach; about 1/3 of a second is needed for matching the query image with one image in the data base.

Fig. 11. (a) The 61 images in the third data base. (b) Query results by the proposed method.

The effect of reducing the resolution of the query images is investigated and shown in Table III. As we see, the proposed method performs the best among the testing methods, especially for the resolution 9 × 7.
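The noise degradation described above can be reproduced by drawing zero-mean Gaussian noise and scaling it to a target SNR. This is a sketch under the assumption that "power" means the mean squared component value; the function name is ours, and the generator is seeded only for reproducibility:

```python
import math
import random

def add_noise_at_snr(pixels, snr_db, rng=random.Random(0)):
    """Add zero-mean Gaussian noise scaled so that
    10*log10(signal_power / noise_power) equals snr_db,
    then clip the result to the valid 8-bit range."""
    signal_power = sum(p * p for p in pixels) / len(pixels)
    noise_power = signal_power / (10 ** (snr_db / 10))
    sigma = math.sqrt(noise_power)
    return [min(255.0, max(0.0, p + rng.gauss(0, sigma))) for p in pixels]
```

For a flat 128-valued signal at 20 dB, the noise standard deviation works out to 12.8, so the output stays centered near 128 with variance near 164.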
To simulate the changing lighting condition, we have multiplied the image pixel values by a constant factor ranging from 0.5 to 1.5 for each query image. The resultant pixel values were constrained to be no greater than 255. The transformed images were then matched to the data base of Fig. 4. The results are displayed in Fig. 8. In this simulation, we have also included Funt's method [18], which is specifically designed to be invariant to lighting. Except for our method and Funt's method, the matching accuracy degrades significantly when the factor values approach 1.5 and 0.5. As we know, the effect of multiplying each pixel by a factor value is to change the histogram of the color components notably. For the histogram-matching methods, like Swain's method and the reference color table method, this multiplication effect thus lowers the matching accuracy. Since Funt's method uses the ratios of RGB color triples from neighboring locations, its results are constant for this simulation. As can be verified, Funt's method has the best performance for a factor value of 1.5. On the other hand, since the order of the sequence of feature vectors does not change too much for these factor values, our method generally has good performance as compared with the other testing methods. However, with a multiplication factor larger than 1.5, it is expected that the matching accuracy of our method will degrade gradually.

In addition, we constructed a second data base of 65 trademark images to test another image data-base retrieval scenario. Given a query image, we want to obtain a list of images from the trademark data base that most resemble the colors in the query image. We captured six query images, designated T1 to T6, and manually listed the resembling images found in the data base for each query image. Each data-base and query image, of size 128 × 128, consists of 8-bit "red," "green," and "blue" color components.
The data-base and query images are displayed in Fig. 9. Let n be the number of manually listed images for each query image. We applied the above testing methods to each query image. A list of images of size n for each query image is then created to check how many images are correctly matched. If m is the number of such matched images, the retrieval efficiency η can be defined as follows:

    η = m / n.

In Table IV, we show the retrieval efficiency of each query image for two color spaces, the RGB and opponent color axes [5], with n shown below the notation of each query image. As we observe, the proposed method has better average retrieval efficiency than the other testing methods for this trademark retrieval scenario. The performance of the RGB space and that of the opponent color axes is similar for the proposed method. However, for Swain's method, the opponent axes have better results. The retrieval results of the second query image for the RGB space are also illustrated in Fig. 10.

Finally, we have applied the proposed method to a data base of more natural images with a wider range of colors in order to give more insight into its performance. The data base shown in Fig. 11(a) is populated by 61 color images, which have 8-bit "red," "green," and "blue" color components. Each time, we picked a color image from the data base as the query sample in order to retrieve the top four images that most resembled it in colors. Fig. 11(b) displays three examples of image retrieval. Although the first candidate may not subjectively be the best match, it can be observed that all the retrieved images contain color content quite similar to the query sample. To accurately rank retrieved images, we understand that other features like shape, texture, and color histogram can be adopted. This approach is still open for further study.

V. CONCLUSION

This paper presents an approach to color feature extraction for image indexing. This approach, called DSQ, is similar to the design of a color palette for an image.
The DSQ approach first extracts the characteristic colors, which contain the color information of the image, by sequentially quantizing the color components. The chosen characteristic colors are then ordered to obtain a sequence of feature vectors. Using this sequence of feature vectors, a matching method, called the dynamic matching method, is proposed to carry out the task of image data-base retrieval. The proposed method is mainly based on the dynamic programming equations to obtain the best match path between two input sequences of feature vectors. The color distance used to obtain the best match path is Godlove's formula for the HVC color space. The experimental results show that the characteristic colors are reliable color features for image indexing. The proposed matching method has acceptable accuracy of image retrieval as compared with the existing color indexing methods mentioned in Section IV. In addition, we have analyzed the computational complexity of both the DSQ approach and the dynamic matching method. Our results show that the proposed method is feasible for implementation.

ACKNOWLEDGMENT

The authors would like to thank the anonymous reviewers, who made many useful comments. Their help is gratefully appreciated.

REFERENCES

[1] T. L. Kunii, Visual Database Systems. Amsterdam, The Netherlands: Elsevier, 1989.
[2] E. Knuth and L. M. Wegner, Visual Database Systems II. Amsterdam, The Netherlands: Elsevier, 1992.
[3] R. M. Haralick and L. G. Shapiro, Computer and Robot Vision. Reading, MA: Addison-Wesley, 1993.
[4] G. Healey, "Using color for geometry insensitive segmentations," J. Opt. Soc. Amer. A, vol. 6, pp. 920-937, June 1989.
[5] M. J. Swain and D. H. Ballard, "Color indexing," Int. J. Comput. Vision, vol. 7, no. 1, pp. 11-32, July 1991.
[6] B. M. Mehtre, M. S. Kankanhalli, A. D. Narasimhalu, and G. C. Man, "Color matching for image retrieval," Pattern Recognit. Lett., vol. 16, pp. 325-331, Mar. 1995.
[7] P. Heckbert, "Color image quantization for frame buffer display," Comput. Graph., vol. 16, no. 3, pp. 297-307, July 1982.
[8] M. T. Orchard and C. A. Bouman, "Color quantization of images," IEEE Trans. Signal Processing, vol. 39, pp. 2677-2690, Dec. 1991.
[9] S. C. Pei and C. M. Cheng, "Dependent scalar quantization of color images," IEE