US20070127043A1 - Image processing apparatus and control method thereof - Google Patents
- Publication number
- US20070127043A1 (application Ser. No. US11/564,426)
- Authority
- US
- United States
- Prior art keywords
- image
- page
- background image
- data
- text
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/64—Systems for the transmission or the storage of the colour picture signal; Details therefor, e.g. coding or decoding means therefor
- H04N1/642—Adapting to different types of images, e.g. characters, graphs, black and white image portions
Definitions
- the present invention relates to a technique of performing compression coding on a document image constructed with plural pages, and generating one file.
- image data (particularly color image data) has a large data size.
- image data is stored on a server provided on a network
- a network traffic problem occurs.
- a large-capacity storage device is required.
- the technique of changing the compression method in accordance with a characteristic or attribute of the image data (e.g., Japanese Patent Application Laid-Open Nos. 10-51642 and 10-215379)
- the technique of extracting layer structures of an image and performing efficient compression for each of the layers (Japanese Patent Application Laid-Open No. 2000-306103)
- one sheet of document image data can be compressed to image data of a small data size, while maintaining reasonable image quality.
- one page of uncompressed image data can be compressed to as little as 1/10 of its original size.
- accordingly, a storage device of a given capacity can store and manage ten times the document volume.
- although the compression rate can be increased further, an extremely high compression rate results in unclear text in a decoded document and makes the text difficult to read.
- document images are stored and managed in units of sheets and are independent of each other, having no relationship to one another.
- a document prepared for presentation used in an office conference or the like is often constructed with plural pages, and many of them are generated by a PC.
- a background image is often provided to each page for better appearance.
- the inventor of the present invention has focused attention on the fact that the background images of the respective pages are highly likely to be the same in such a document prepared for presentation.
- the present invention takes advantage of this feature of the document having the same background image, and provides a technique for further improving the compression rate while maintaining high image quality.
- to achieve this, an image processing apparatus has, for instance, the following configuration. More specifically, the present invention provides an image processing apparatus that performs compression coding on image data and stores the image data as an electronic file.
- when a file is generated from image data having plural pages, coded data of a background portion that is common to the plural pages is generated and shared. Accordingly, a file compressed at a high compression rate can be generated while maintaining high image quality.
- FIG. 1 is a block diagram showing a construction of a data processing apparatus according to an embodiment of the present invention
- FIG. 2 is a diagram showing an example of a network system to which the embodiment is applicable;
- FIG. 3 is a flowchart showing the main processing of the data processing apparatus according to the embodiment.
- FIG. 4 is a flowchart of the page processing shown in FIG. 3 ;
- FIG. 5 is a flowchart of the comparison processing shown in FIG. 4 ;
- FIG. 6 is a view showing binary image generation processing based on original image data according to the embodiment.
- FIG. 7 is an explanatory view of isolated pixel removing processing according to the embodiment.
- FIG. 8 is an explanatory view of image region division processing from a binarized result according to the embodiment.
- FIGS. 9A and 9B show a tree structure where respective nodes have data of respective regions generated by image region division, and a data content of each node;
- FIG. 10 is an explanatory view of non-text image deletion processing according to the embodiment.
- FIG. 11 is an explanatory view of text color decision processing according to the embodiment.
- FIG. 12 is an explanatory view of text deletion processing according to the embodiment.
- FIGS. 13A to 13C are explanatory views of text image deleted portion filling (painting) processing according to the embodiment;
- FIGS. 14A and 14B are views showing a background image and a text image separated from each other;
- FIG. 15 is a view showing a background image which is divided into blocks
- FIG. 16 is a view showing how block images on the first page are stored
- FIGS. 17A to 17D are views showing text images separated in units of color;
- FIG. 18 is a view showing a binarized result of a text image
- FIG. 19 is an explanatory view of a background image and a text image of the second page separated from each other;
- FIG. 20 is a view showing a background image of the second page which is divided into blocks
- FIG. 21 is an explanatory view of pixel block comparison processing between the first page and second page
- FIG. 22 shows an arithmetic expression of a pixel block similarity level
- FIGS. 23A and 23B are views showing how pixel blocks of a page-unique portion and a common portion in a background image are stored upon completion of second page processing;
- FIG. 24 is an explanatory view of a background image and a text image of the third page separated from each other;
- FIGS. 25A and 25B are views showing how pixel blocks of a page-unique portion and a common portion in a background image are stored upon completion of third page processing;
- FIG. 26 is a view showing how the page-unique portion is pasted on the first to third pages respectively;
- FIG. 27 is a view showing how the common background portion is pasted on the first to third pages respectively;
- FIG. 28 is a view showing restored images of the first to third pages
- FIG. 29 shows an example of the common background image according to the second embodiment
- FIG. 30 is an explanatory view of superimposing a binary image on a background image according to the third embodiment
- FIG. 31 is an explanatory view of pixel block segmentalization and comparison processing according to the fourth embodiment.
- FIG. 32 shows how segmentalization is applied in the fourth embodiment.
- FIG. 1 is a block diagram showing a construction of a data processing apparatus 100 according to the present embodiment. Assume that the data processing apparatus 100 according to the present embodiment has a hardware configuration of a general-purpose data processing apparatus, e.g., a personal computer (PC).
- numeral 1 denotes a CPU for controlling the entire apparatus; 2 , ROM for storing a boot program and BIOS; and 3 , RAM used as a work area of the CPU 1 .
- an operating system (OS), various processing programs, and furthermore data are read and processed by the CPU 1 .
- Numeral 4 denotes a hard disk device (HDD) serving as an external storage device.
- an OS 4 a and a processing program 4 b according to the present embodiment are installed in the HDD 4 .
- the HDD 4 also stores data files generated by various application programs and image data files inputted by an input device, such as a scanner, which will be described later.
- Numeral 5 denotes a keyboard; and 6 , a pointing device such as a mouse.
- Numeral 7 denotes a video memory and a display controller which performs rendering processing on the video memory, image data reading from the video memory at predetermined timing, and video signal output to a display device 8 .
- the display device 8 may be of a CRT, a liquid crystal display device or the like.
- Numeral 9 denotes a scanner interface (I/F) for connecting an image scanner 10 that reads an original document. Typical examples of the scanner I/F 9 include an SCSI interface, a USB interface, and the like.
- the image scanner 10 comprises an auto-document feeder (ADF) that has a function for sequentially conveying plural sheets of document to a reading position and reading the document.
- Numeral 11 denotes a network interface (I/F) such as an Ethernet.
- the CPU 1 executes the boot program stored in the ROM 2 and loads the OS from the HDD 4 to the RAM 3 to function as a data processing apparatus. Thereafter, when a user operates the keyboard 5 and the pointing device 6 to start the processing program 4 b (load the program in the RAM 3 and execute), the apparatus functions as a device that performs image data compression processing according to the present embodiment.
- the pages of an original document having plural pages set on the image scanner 10 are sequentially read in accordance with a user's instruction, and the read image data is stored in the RAM 3 . Then, processing that will be described later is performed on the image data of the respective pages, thereby generating one file corresponding to the original document having plural sheets, and the file is stored in the HDD 4 .
- FIG. 2 shows an example thereof.
- the data processing apparatus 101 causes the scanner 200 on the network to read an original document image, and receives the read image data.
- the data processing apparatus 101 sequentially transfers the read image data to a data processing apparatus 102 through a router 300 .
- the data processing apparatus 102 executes the above-described compression processing on the received image data, and transfers the data file generated as a result of compression to a data processing apparatus 103 .
- the data processing apparatus 103 which serves as a database server or a file server, stores and keeps the transferred data file.
- the data processing apparatus 102 performs the processing similar to that of FIG. 1 .
- this configuration can be realized in an environment where plural LANs are connected, and thus can also be realized through, e.g., the Internet.
- processing content of the processing program 4 b executed by the data processing apparatus 100 is described with reference to the flowchart in FIG. 3 .
- this processing is executed when the processing program 4 b (application) is started and original document reading is instructed by the pointing device 6 on the GUI screen (not shown) provided by the program.
- step S 101 one of the plural sheets of original document set on the scanner 10 is read.
- the pixels constituting the page image data of the read one sheet of original document are multi-valued color image data where R, G and B are respectively expressed by 8 bits.
- the image data expressed in RGB color space is converted to YCbCr (luminance, color difference) space. Since the conversion that is realized by matrix operation is well-known, description thereof is omitted.
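The RGB-to-YCbCr conversion mentioned above is a standard matrix operation; a minimal sketch using the ITU-R BT.601 full-range coefficients (the patent does not specify which matrix variant it uses, so the coefficients here are an assumption):

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel to YCbCr (ITU-R BT.601, full range)."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128
    return (round(y), round(cb), round(cr))
```

For instance, pure white (255, 255, 255) maps to luminance 255 with neutral color differences (128, 128).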
- step S 102 page processing is executed on the read page image data.
- the page processing is described in detail with reference to the flowchart in FIG. 4 .
- step S 201 color image data expressed in YCbCr space is binarized, thereby generating binary image data.
- FIG. 6 is an explanatory view of binary image generation processing based on color image data.
- Y values representing a luminance component are extracted from the inputted original image data in YCbCr space, and a histogram 501 is generated. Since the histogram represents the luminance component, the number of pixels constituting texts and line drawings is by far smaller than the number of pixels constituting the background image. Further, since the background image has a higher luminance (lower density) level than that of texts and line drawings, when the distribution line of the histogram is followed from the highest luminance level to a low luminance level, the value that first goes below the threshold value T is acquired as a local minimal value, and the luminance of the local minimal value is decided as a threshold value used in binarization.
- Pixels equal to or lower than the binarization threshold value decided in the foregoing manner are processed as black pixels, and pixels higher than the threshold value are processed as white pixels, thereby obtaining the binary image data 502 shown in FIG. 6 .
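The threshold decision and binarization described above might be sketched as follows. The sketch builds the Y histogram and walks downward from the background peak until the pixel count first drops below T; starting the walk at the histogram peak, and all names, are the editor's assumptions — the patent only describes following the distribution from high to low luminance:

```python
def luminance_threshold(y_pixels, t):
    """Build a histogram of 8-bit luminance values and, walking from the
    background peak toward low luminance, return the first level whose
    pixel count falls below t (the binarization threshold)."""
    hist = [0] * 256
    for y in y_pixels:
        hist[y] += 1
    peak = max(range(256), key=lambda level: hist[level])  # background luminance
    for level in range(peak, -1, -1):
        if hist[level] < t:
            return level
    return 0

def binarize(y_pixels, threshold):
    """Pixels at or below the threshold become black (1), the rest white (0)."""
    return [1 if y <= threshold else 0 for y in y_pixels]
```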
- in step S 202 , noise removal processing is performed on the generated binary image data, thereby producing image data to be subjected to block selection.
- the noise removal processing is equivalent to isolated pixel removing. This is described with reference to FIG. 7 . Note that each pixel of the binary image data is expressed by 1 bit. For descriptive purpose, 1 expresses a black pixel and 0 expresses a white pixel herein.
- the pixel of interest P(x, y) is determined as an isolated pixel, and the value of the pixel of interest P(x, y) is converted to “0”.
- when the pixel of interest is numeral 700 in FIG. 7 , the pixel is determined to be an isolated pixel and is converted to a white pixel.
- when the pixel of interest is numeral 701 , since one of the peripheral pixels, i.e., the neighboring pixel on the right, is “1”, the pixel of interest 701 is determined not to be an isolated pixel, and the original binary data is maintained.
- the above processing is sequentially performed with respect to the pixel of interest P(x, y) by updating x and y in accordance with raster scanning, thereby completing noise removal on the binarized image.
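The isolated-pixel removal of FIG. 7 checks the eight neighbours of each black pixel; a simplified sketch (writing results to a copy rather than updating in raster order, which is an assumption made for clarity):

```python
def remove_isolated_pixels(img):
    """img: 2-D list of 0/1 values. A black pixel (1) whose eight
    neighbours are all white (0) is treated as noise and whitened."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] != 1:
                continue
            neighbours = [
                img[ny][nx]
                for ny in range(y - 1, y + 2)
                for nx in range(x - 1, x + 2)
                if (ny, nx) != (y, x) and 0 <= ny < h and 0 <= nx < w
            ]
            if not any(neighbours):   # no black neighbour -> isolated
                out[y][x] = 0
    return out
```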
- step S 203 the binary image data, on which above-described noise removal processing has been performed, is inputted, then block selection processing (image region division processing that divides the image into blocks in units of attribute) is performed, and block information and attribute information are added to the image data before being outputted.
- an attribute is given to the image region based on the size of the black pixel group included in the binary image data and uniformity of the image block.
- the output is image blocks consisting of titles, texts, graphics, tables, and images (pictures), together with the attribute, positional information and size of each block.
- FIG. 8 shows the result of image region division using block selection.
- the inputted image data is divided into block regions including eight text regions Text 1 - 8 , one table region Table 1 , and one picture region Picture 1 , as well as three line regions Line 1 - 3 .
- FIG. 9A shows a logical structure (tree structure) obtained as a result of block selection processing.
- processing for text images is performed on all block data obtained herein.
- each node includes attribute information indicative of whether the node is a text region or a table region, the coordinate position (x, y) of the top left corner of the region, and the width and height of the region given that the region is rectangular.
- each node in order to manage the region having a text image as a child node, each node includes a pointer to the child node, and furthermore, a pointer to the next node. Since the table region also includes texts, it has a similar structure.
- step S 204 one region is extracted and it is determined whether or not the region is a text region. This is determined by referring to the node attribute in the logical structure of the block selection shown in FIG. 9B .
- if it is determined to be a text region, processing in step S 205 is performed. If it is determined to be a non-text region, processing in step S 204 - 2 is performed. In this embodiment, since the attribute of the first node (left-end node) is a line as shown in FIG. 9A , the region is determined to be a non-text region, and processing in step S 204 - 2 is performed.
- FIG. 10 is an explanatory view of non-text image deletion from a binary image. Based on node data of the block selection, an image region to be deleted is specified, and the specified image is deleted from the binary image data.
- if node data can be acquired, step S 208 is executed and a node of the next block selection is acquired. If node data cannot be acquired, in other words, if processing completion on all blocks is determined, processing in step S 209 is executed.
- step S 204 it is again determined whether the node attribute of the block selection subjected to processing is a text region or a non-text region. If the node of interest is a text region, processing in step S 205 is executed.
- step S 205 node data of the block selection in a text region is analyzed to acquire text node data which is a final unit of a text block. Color data is added to the acquired text data.
- color data of each pixel constructing the text corresponding to the node data subjected to current processing is extracted from the original YCbCr image data, and a text color is determined and added to the node data.
- step S 206 based on information corresponding to the node that is determined as a text in step S 205 , each pixel constructing the target text image is deleted from the original YCbCr color image.
- FIG. 12 is an explanatory view of text image deletion. Pixel data corresponding to the pixel position having “1” in the binary image is deleted from the YCbCr color image.
- step S 207 filling (painting the blank in the deleted portion) is performed on the color image where text image is deleted, using the color data of peripheral pixels.
- when the known colors on the periphery of the filling-target pixel (whose value is unknown) are all the same, filling (painting) is performed with the same color as the peripheral pixels.
- when there are N types of colors in the neighboring pixels (two types in FIG. 13B ), an average value of the color components is calculated to determine the color of the filling-target pixel. In this manner, the pixels where the text is deleted are filled with the peripheral color, i.e., the background color.
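The filling rule of FIGS. 13A to 13C (reuse a single peripheral color, otherwise average the N colors per component) can be sketched as below; the function name and the (Y, Cb, Cr) tuple representation are assumptions:

```python
def fill_color(neighbour_colors):
    """neighbour_colors: known (Y, Cb, Cr) tuples around the deleted pixel.
    A single color is reused as-is; N colors are averaged per component."""
    n = len(neighbour_colors)
    return tuple(
        round(sum(c[i] for c in neighbour_colors) / n) for i in range(3)
    )
```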
- steps S 204 to S 208 is performed with respect to all the nodes.
- the text color is determined, then the text image is deleted from the color image, and filling is performed with a peripheral color.
- image data is deleted based on the binary image.
- a color image and a binary image shown in FIGS. 14A and 14B can be obtained. Texts are deleted from the color image, and filling is performed on the deleted text pixels. Image data other than texts is deleted from the binary image.
- step S 209 comparison processing is performed between the color image where text removal and filling have been performed and the color image of the previous page where the similar processing has been performed.
- step S 301 color image data of the target page, on which text removal and filling have been performed, is divided into blocks of an MCU (Minimum Coded Unit), which are the comparison unit.
- image block addresses (1, 1) to (m, n) are assigned to respective blocks.
- Each of the divided blocks is the comparison unit.
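The division into MCU blocks carrying addresses (1, 1) to (m, n) might look like the sketch below (16 × 16 blocks as in FIG. 15 ; the dictionary keyed by block address is an assumed representation):

```python
def divide_into_blocks(img, size=16):
    """Split a 2-D image (list of rows) into size x size blocks keyed by
    1-based block addresses (1, 1)..(m, n)."""
    h, w = len(img), len(img[0])
    blocks = {}
    for by in range(0, h, size):
        for bx in range(0, w, size):
            addr = (bx // size + 1, by // size + 1)
            blocks[addr] = [row[bx:bx + size] for row in img[by:by + size]]
    return blocks
```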
- step S 302 it is determined whether or not there is comparison-target image data. If there is no comparison-target image block data in the comparison image area, image processing in step S 303 is performed. Assuming that the target page is the first page, since there is no comparison target, processing of step S 303 is performed. Note that, besides the first page, if it is determined that there is no comparison target, the data in the comparison image area might have been flushed for some reason (e.g., lack of memory).
- step S 303 all data of the target page is stored as a page image in the page comparison image area (secured in the RAM), and the control ends.
- the page comparison image area has areas for storing addresses of respective block images, image data, page data and so forth as shown in FIG. 16 .
- step S 210 the text image is compressed.
- the text image compression is performed after a binary image is generated for each text color.
- FIGS. 17A to 17 D are explanatory views of image data generation in units of text color. Based on text color data of each node acquired in block selection, the same or similar text colors are put together as one text color, and binary image data is generated in units of color. The number of approximating colors is statistically about 16 at most.
- binary image data of the respective colors can be obtained as shown in FIGS. 17A to 17D .
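Putting the same or similar text colors together as one text color could be sketched as a simple per-component tolerance clustering; the tolerance value and data layout are assumptions, and the patent notes only that about 16 colors at most result in practice:

```python
def group_text_colors(nodes, tol=32):
    """nodes: list of (node_id, (Y, Cb, Cr)) pairs. Colors within `tol`
    per component share one binary-image layer."""
    layers = []   # list of (representative_color, [node_ids])
    for node, color in nodes:
        for rep, members in layers:
            if all(abs(a - b) <= tol for a, b in zip(rep, color)):
                members.append(node)   # similar enough: reuse this layer
                break
        else:
            layers.append((color, [node]))
    return layers
```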
- each binary image is compressed by MMR processing, which is a compression method appropriate for binary image data.
- the page image as shown in FIG. 18 is stored as binary image data of the first page.
- step S 103 in FIG. 3 it is determined in step S 103 in FIG. 3 if there is a next page. Since there is still a page, image data of the second page is read and the processing similar to the first page is performed.
- FIG. 19 shows an inputted image of the second page as well as a binary image and a background image obtained as a result of processing in steps S 201 to S 208 .
- step S 301 the inputted background image is divided. Similarly to the first page, the image is divided into MCU blocks.
- the image data is divided in units of 16 × 16 pixels as shown in FIG. 20 .
- Image block addresses (1, 1) to (m, n) are assigned to the divided blocks respectively, and comparison processing is performed in units of the divided block.
- step S 302 it is determined whether or not there is a comparison-target image block in the comparison image area. Since the currently processed image is a second page, the page comparison image area contains image block data obtained in the previous processing. Therefore, comparison-base image block acquisition processing in step S 304 is performed.
- step S 304 the image block of the second page that serves as a comparison base, is acquired.
- the comparison processing ends.
- step S 304 - 2 image data to be compared with the block data of the original image is extracted from the common comparison image area and page comparison image area.
- FIG. 21 is an explanatory view of comparison image block acquisition processing. As shown in FIG. 21 , all image blocks having the same address as the comparison-base block address are extracted as comparison target data.
- step S 305 the image blocks are compared to calculate the similarity level.
- FIG. 22 shows a similarity level calculation formula used for calculating the similarity level of the image blocks. Using the similarity level calculation formula where a similarity level is expressed by a vector's cosine value, the similarity level between the comparison-base image block and the comparison-target image block is calculated in units of pixel component.
- step S 306 determination is made as to whether or not to consider the comparing image blocks the same image.
- threshold values are set respectively for the similarity levels of the respective components Y, Cb and Cr. For instance, a similarity level of 95 or more is set as the threshold for luminance data Y, and 90 as the threshold for color difference data Cb and Cr.
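FIG. 22 expresses the similarity level as a vector's cosine value. A plausible reading, computed per pixel component and scaled to 0-100 so it can be checked against the thresholds of 95 (Y) and 90 (Cb, Cr) above, is sketched below; since the exact formula of FIG. 22 is not reproduced here, this form is an assumption:

```python
import math

def cosine_similarity(a, b):
    """Similarity of two pixel-component vectors as the cosine of their
    angle, scaled to 0-100."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 100.0 * dot / norm if norm else 100.0

def same_block(y1, y2, cb1, cb2, cr1, cr2, t_y=95.0, t_c=90.0):
    """Blocks count as the same image only if every component clears
    its threshold."""
    return (cosine_similarity(y1, y2) >= t_y
            and cosine_similarity(cb1, cb2) >= t_c
            and cosine_similarity(cr1, cr2) >= t_c)
```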
- the image block is stored as the same image in the common image area in step S 307 .
- the image block is stored as a difference image in the page comparison image area in step S 308 .
- step S 307 the image block having a similarity level equal to or larger than the threshold value is stored in the common comparison image area.
- comparison-target image block exists in the common comparison image area
- page data is added to the page data of the corresponding image block address of the common comparison image area. If the comparison-target image block exists in the page comparison image area, the image block data in the page comparison image area is moved to the common comparison image area, and page data of the current comparison-base image block is added.
- step S 308 with respect to the image block having a similarity level less than the threshold value, data of the currently processed image block is stored in the page comparison image area.
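Steps S307 and S308 can be summarized as routing each compared block either to the common comparison image area (promoting any copy left in the page comparison image area and accumulating page data) or to the page comparison image area. The dictionary structures below are hypothetical simplifications of the areas shown in FIGS. 23A and 23B:

```python
def classify_block(addr, block, similarity, threshold, page_no,
                   common_area, page_area):
    """Route one comparison result, per steps S307/S308 (simplified)."""
    if similarity >= threshold:
        entry = common_area.setdefault(addr, {"image": block, "pages": []})
        if addr in page_area:                 # promote the earlier page's copy
            prev = page_area.pop(addr)
            entry["image"] = prev["image"]
            entry["pages"].extend(prev["pages"])
        entry["pages"].append(page_no)        # add current page data
    else:
        page_area[addr] = {"image": block, "pages": [page_no]}
```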
- binary image data is generated in step S 210 similarly to the first page, thereby registering the second-page image data.
- FIG. 24 shows image data of the inputted image of the third page, in which the text region that is the foreground is separated from the background.
- the inputted image is divided into image blocks, and the image block data is compared to block data in the comparison image area to calculate the similarity level.
- the image block data is stored in the common comparison image area; whereas if the similarity level is less than the threshold value, the image block data is stored in the page comparison image area.
- the image block data is still registered in the common comparison image area. For instance, in a case where the image block data is common in the second and third pages but not in the first page, the image block data of the second and third pages is registered in the common comparison image area and the image block data of the first page is left in the page comparison image area.
- image block data is stored in the comparison image areas shown in FIGS. 25A and 25B .
- binary image data is generated in step S 210 similarly to the first and second pages, and the generated binary image data is registered as the third-page image data.
- in step S 103 in FIG. 3 , it is determined based on the status signal from the image scanner 10 whether or not data indicating that no unread original document remains has been received. If comparison processing is completed for all pages, the control proceeds to step S 104 .
- step S 104 the image block data stored in the page comparison image area is added to the page data as respective page information.
- the image block data remained in the page comparison image area is page-unique image data. Therefore, the image block data of the same page is extracted to generate image data, and compression processing appropriate for the page image is performed. Then, the generated image data is pasted on the page image as a background of the binary image generated for each page in step S 210 .
- step S 105 image block pasting is performed with respect to all image blocks stored in the page comparison image area.
- common image link generation processing in step S 106 is performed.
- Numerals 2601 to 2603 in FIG. 26 show images of pasting processing on the page image. For each of the page-unique image data, image data pasting is performed as a background image of the binary image extracted as a foreground.
- step S 106 common image data is generated from the image block data stored in the common comparison image area and appropriate compression is performed. Since the common image data is multi-valued color image data, a compression technique such as GIF, JPEG and the like may be employed in the compression processing.
- the image data generated from the common comparison image area has no tangible data in each page. By generating a link from one tangible data to respective pages, the page image is generated.
- the image data is complemented by performing filling processing with the peripheral color similarly to the deleted text portion filling processing in step S 207 .
- partially common image data and all-page common image data are generated.
- the partially common image data is pasted as a background of the page image, and then the all-page common image data is superimposed as the background of all image data.
- image data is completed by superimposing images from the background to the foreground, i.e., in order of the all-page common image, the partially common image, the page image, and the binary image.
- a file having a structured document format that can describe plural pages of document (e.g., a PDF file) is generated, and an image file shown in FIG. 28 can be acquired as an output result.
- in step S 106 of the first embodiment, all-page common image data that is common to all pages and partially common image data that is common to only some of the pages are generated. Then, the generated common image data is linked to the corresponding pages to generate one page of image data.
- the second embodiment pays attention to the partially common image data of the common image block data.
- with respect to image block data that is common to over 50% of all pages, image data is generated as the all-page common image block.
- first-second page common image data and second-third page common image data shown in FIG. 27 which are the partially common image data, include image data that is common in over 50% of all pages (3 pages). Therefore, these partially common image data can be processed as the all-page common image data.
- image data having links to respective pages is only the common image data as shown in FIG. 29 . Therefore, the second embodiment can increase the compression efficiency by the size of the partial image data, compared to the first embodiment. The larger the number of pages to be processed, the greater the effect.
- image block data includes image block addresses, image data, and page data.
- image block data contains information on whether or not filling processing of the foreground has been performed on the image block.
- processing in step S 307 is performed.
- the image block is registered in the common comparison image area as common image data.
- an image block which has not been subjected to filling is registered by priority as image data of the image block data.
- FIG. 30 is an explanatory view of processing according to the third embodiment.
- The image block 2 having a small filling rate is registered as the image block data.
- The image block 3 having a small filling rate is registered as the image block data.
- The trace of filling remains to some extent.
- In the processing of the third embodiment, since the image of the ultimately acquired image block has the smallest filling rate (i.e., is closest to the original background), the trace of filling does not become conspicuous. Therefore, when the first embodiment is compared with the third embodiment, executing the processing according to the third embodiment achieves a higher-quality image.
- Image block division is performed in units of 16×16 pixels and comparison is performed.
- Even when the blocks are nearly identical, the comparison result may find the image blocks different. In an extreme case, if only 1 out of the 256 pixels in a 16×16 pixel block is different, the image blocks are considered different.
- To address this, the fourth embodiment adds processing of further segmentalizing the image blocks and comparing the blocks again.
- Assume that the similarity level of image blocks 1 and 2 is 70%, the similarity level of image blocks 1 and 3 is 70%, and the similarity level of image blocks 2 and 3 is 80%. If the threshold value of the same-image-block determination is 90%, image blocks 1, 2, and 3 are all recognized as different images.
- In this case, the image blocks are further segmentalized to determine whether or not they are the same image block.
- The pixel block having a size of 16×16 pixels is segmentalized into image blocks of 8×8 pixels, and each of the segmentalized image blocks is reevaluated.
- Segmentalized block regions 1, 2, and 3 of pixel block 2 can be determined to be the same as the corresponding regions 1, 2, and 3 of pixel block 1. Therefore, only the region 4 is stored with respect to pixel block 2.
- When the segmentalization processing is carried further, down to the level of the pixel unit, only the portion that has failed in filling processing is extracted as a page image as shown in FIG. 32, and the remaining image data can be stored as common image data.
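The fourth-embodiment refinement can be sketched as follows. This is an illustrative reconstruction under stated assumptions: exact equality of sub-blocks stands in for the cosine-similarity test of FIG. 22, and the function name is mine.

```python
def segmentalize_and_compare(block_a, block_b, sub=8):
    """When two 16x16 blocks fail the whole-block similarity test, split
    them into sub x sub regions (8x8 here) and keep only the regions
    that actually differ as page-unique data; matching regions can be
    shared as common image data."""
    size = len(block_a)
    common, unique = [], []
    for sy in range(0, size, sub):
        for sx in range(0, size, sub):
            sa = [row[sx:sx + sub] for row in block_a[sy:sy + sub]]
            sb = [row[sx:sx + sub] for row in block_b[sy:sy + sub]]
            region = (sx // sub + 1, sy // sub + 1)  # 1-based region address
            (common if sa == sb else unique).append(region)
    return common, unique
```

With two 16×16 blocks that differ only in the bottom-right 8×8 region, three regions are shared and only one region needs to be stored per page, mirroring the FIG. 31 example.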
Description
- 1. Field of the Invention
- The present invention relates to a technique of performing compression coding on a document image constructed with plural pages, and generating one file.
- 2. Description of the Related Art
- In recent years, electronic image data read and acquired by a scanner from an original paper document is kept (stored) in a computer and reused.
- However, image data (particularly color image data) has a large data size. In the case in which image data is stored on a server provided on a network, a network traffic problem occurs. Furthermore, to store the data, a large-capacity storage device is required.
- To address these issues, a number of image data compression techniques have been proposed. If image data is compressed at an extremely high compression rate, image quality deteriorates significantly, causing a problem in which the text portion that is considered important in many business documents becomes illegible.
- To solve this problem, the following techniques are known: the technique of changing the compression method in accordance with a characteristic or attribute of the image data (e.g., Japanese Patent Application Laid-Open No. 10-51642 and No. 10-215379), and the technique of extracting layer structures of an image and performing efficient compression for each of the layers (Japanese Patent Application Laid-Open No. 2000-306103).
- According to these techniques, one sheet of document image data can be compressed to a small data size while maintaining reasonable image quality. Assume that one page of uncompressed image data can be compressed to as little as 1/10 of its original size. In this case, a storage device of a given capacity can store and manage 10 times the document volume. Although the compression rate can be increased further, an extremely high compression rate results in unclear text in the decoded document and makes the text difficult to read.
- According to the techniques proposed so far, document images are stored and managed in units of sheets, independently of each other, with no relationship between them.
- Meanwhile, a document prepared for presentation used in an office conference or the like is often constructed with plural pages, and many of them are generated by a PC. In the case of such documents prepared for a presentation, a background image is often provided to each page for better appearance.
- The inventor of the present invention has focused attention on the point that, in the case of such documents prepared for presentation, the background images of the respective pages are highly likely to be the same. The present invention takes advantage of this feature of documents having the same background image, and provides a technique for further improving the compression rate while maintaining high image quality.
- To solve the above-described problem, an image processing apparatus according to the present invention has, for instance, the following configuration. More specifically, the present invention provides an image processing apparatus that performs compression coding on image data and stores the image data as an electronic file, comprising:
- an input unit adapted to input image data corresponding to plural pages,
- a first separation unit adapted to separate the inputted image data of each page into a text image and a background image,
- a first compression coding unit adapted to perform binarization on the separated text image and perform compression coding for binary images,
- a second separation unit adapted to separate the background image into a common background image portion and a page-unique background image portion by comparing the separated background images of respective pages,
- a second compression coding unit adapted to perform compression coding for multi-valued images on the common background image portion and the page-unique background image portion, and
- a file generation unit adapted to generate one electronic file that can describe plural pages, by associating each page with respective coded data of a page-unique text portion and a page-unique background image portion, and by associating the common background image portion of respective pages with a link to coded data of one common background image.
- According to the present invention, when a file is generated from image data having plural pages, coded data of a background portion that is common to plural pages is generated and shared. Accordingly, a file compressed at high compression rate can be generated while maintaining high image quality.
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
-
FIG. 1 is a block diagram showing a construction of a data processing apparatus according to an embodiment of the present invention; -
FIG. 2 is a diagram showing an example of a network system to which the embodiment is applicable; -
FIG. 3 is a flowchart showing the main processing of the data processing apparatus according to the embodiment; -
FIG. 4 is a flowchart of the page processing shown in FIG. 3 ; -
FIG. 5 is a flowchart of the comparison processing shown in FIG. 4 ; -
FIG. 6 is a view showing binary image generation processing based on original image data according to the embodiment; -
FIG. 7 is an explanatory view of isolated pixel removing processing according to the embodiment; -
FIG. 8 is an explanatory view of image region division processing from a binarized result according to the embodiment; -
FIGS. 9A and 9B show a tree structure where respective nodes have data of respective regions generated by image region division, and a data content of each node; -
FIG. 10 is an explanatory view of non-text image deletion processing according to the embodiment; -
FIG. 11 is an explanatory view of text color decision processing according to the embodiment; -
FIG. 12 is an explanatory view of text deletion processing according to the embodiment; -
FIGS. 13A to 13C are explanatory views of text image deleted portion filling (painting) processing according to the embodiment; -
FIGS. 14A and 14B are views showing a background image and a text image separated from each other; -
FIG. 15 is a view showing a background image which is divided into blocks; -
FIG. 16 is a view showing how block images on the first page are stored; -
FIGS. 17A to 17D are views showing text images separated in units of color; -
FIG. 18 is a view showing a binarized result of a text image; -
FIG. 19 is an explanatory view of a background image and a text image of the second page separated from each other; -
FIG. 20 is a view showing a background image of the second page which is divided into blocks; -
FIG. 21 is an explanatory view of pixel block comparison processing between the first page and second page; -
FIG. 22 shows an arithmetic expression of a pixel block similarity level; -
FIGS. 23A and 23B are views showing how pixel blocks of a page-unique portion and a common portion in a background image are stored upon completion of second page processing; -
FIG. 24 is an explanatory view of a background image and a text image of the third page separated from each other; -
FIGS. 25A and 25B are views showing how pixel blocks of a page-unique portion and a common portion in a background image are stored upon completion of third page processing; -
FIG. 26 is a view showing how the page-unique portion is pasted on the first to third pages respectively; -
FIG. 27 is a view showing how the common background portion is pasted on the first to third pages respectively; -
FIG. 28 is a view showing restored images of the first to third pages; -
FIG. 29 shows an example of the common background image according to the second embodiment; -
FIG. 30 is an explanatory view of superimposing a binary image on a background image according to the third embodiment; -
FIG. 31 is an explanatory view of pixel block segmentalization and comparison processing according to the fourth embodiment; and -
FIG. 32 shows how segmentalization is applied in the fourth embodiment. - Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings.
-
FIG. 1 is a block diagram showing a construction of a data processing apparatus 100 according to the present embodiment. Assume that the data processing apparatus 100 according to the present embodiment has the hardware configuration of a general-purpose data processing apparatus, e.g., a personal computer (PC). - In
FIG. 1, numeral 1 denotes a CPU for controlling the entire apparatus; 2, ROM for storing a boot program and the BIOS; and 3, RAM used as a work area of the CPU 1. Into the RAM, an operating system (OS), various processing programs, and furthermore data are read and processed by the CPU 1. Numeral 4 denotes a hard disk device (HDD) serving as an external storage device. As shown in FIG. 1, an OS 4a and a processing program 4b according to the present embodiment are installed in the HDD 4. The HDD 4 also stores data files generated by various application programs and image data files inputted by an input device such as a scanner, which will be described later. Numeral 5 denotes a keyboard; and 6, a pointing device such as a mouse. Numeral 7 denotes a video memory and a display controller which performs rendering processing on the video memory, image data reading from the video memory at predetermined timing, and video signal output to a display device 8. The display device 8 may be a CRT, a liquid crystal display device, or the like. Numeral 9 denotes a scanner interface (I/F) for connecting an image scanner 10 that reads an original document. Typical examples of the scanner I/F 9 include a SCSI interface, a USB interface, and the like. Assume that the image scanner 10 comprises an auto-document feeder (ADF) that has a function of sequentially conveying plural sheets of a document to the reading position and reading them. Numeral 11 denotes a network interface (I/F) such as Ethernet. - In the above-described construction, when the power of the
data processing apparatus 100 is turned on, the CPU 1 executes the boot program stored in the ROM 2 and loads the OS from the HDD 4 into the RAM 3 so as to function as a data processing apparatus. Thereafter, when a user operates the keyboard 5 and the pointing device 6 to start the processing program 4b (i.e., load the program into the RAM 3 and execute it), the apparatus functions as a device that performs the image data compression processing according to the present embodiment. - When the
processing program 4b is executed, an original document having plural pages set on the image scanner 10 is sequentially read in accordance with the user's instructions, and the read image data is stored in the RAM 3. Then, processing that will be described later is performed on the image data of the respective pages, thereby generating one file corresponding to the original document having plural sheets, and the file is stored in the HDD 4. - Note that although the above description provides an example of a stand-alone apparatus, the image scanner and the apparatus (a database server or a file server) where the generated compressed data file is ultimately stored may be provided on a network.
FIG. 2 shows an example thereof. - In
FIG. 2, the data processing apparatus 101 causes the scanner 200 on the network to read an original document image, and receives the read image data. The data processing apparatus 101 sequentially transfers the read image data to a data processing apparatus 102 through a router 300. The data processing apparatus 102 executes the above-described compression processing on the received image data, and transfers the data file generated as a result of the compression to a data processing apparatus 103. The data processing apparatus 103, which serves as a database server or a file server, stores and keeps the transferred data file. In the construction shown in FIG. 2, the data processing apparatus 102 performs processing similar to that of FIG. 1. As can easily be understood from the construction in FIG. 2, this configuration can be realized in an environment where plural LANs are connected, and thus can be realized through, e.g., the Internet. - To simplify the description, the following description will be given based on the construction shown in
FIG. 1. However, as described with reference to FIG. 2, it is apparent that the processing can be executed through a network. - Hereinafter, the processing content of the
processing program 4b executed by the data processing apparatus 100 according to the present embodiment is described with reference to the flowchart in FIG. 3. Note that this processing is executed when the processing program 4b (application) is started and original document reading is instructed via the pointing device 6 on the GUI screen (not shown) provided by the program. - In step S101, one of the plural sheets of the original document set on the
scanner 10 is read. The pixels constituting the page image data of the read sheet of the original document are multi-valued color image data in which R, G and B are each expressed by 8 bits. In this embodiment, the image data expressed in RGB color space is converted to YCbCr (luminance, color difference) space. Since this conversion, realized by a matrix operation, is well known, a description thereof is omitted. - In step S102, page processing is executed on the read page image data. The page processing is described in detail with reference to the flowchart in
FIG. 4 . - In step S201, color image data expressed in YCbCr space is binarized, thereby generating binary image data.
-
FIG. 6 is an explanatory view of binary image generation processing based on color image data. Y values representing the luminance component are extracted from the inputted original image data in YCbCr space, and a histogram 501 is generated. Since the histogram represents the luminance component, the number of pixels constituting texts and line drawings is far smaller than the number of pixels constituting the background image. Further, since the background image has a higher luminance (lower density) level than texts and line drawings, when the distribution line of the histogram is followed from the highest luminance level toward lower luminance levels, the value that first goes below the threshold value T is acquired as a local minimum, and the luminance of that local minimum is decided as the threshold value used in binarization.
binary image data 502 shown inFIG. 6 . - In step S203, noise removal processing is performed on the generated binary image data, thereby producing image data that is subjected to block selection. The noise removal processing is equivalent to isolated pixel removing. This is described with reference to
FIG. 7 . Note that each pixel of the binary image data is expressed by 1 bit. For descriptive purpose, 1 expresses a black pixel and 0 expresses a white pixel herein. - Assume that the pixel of interest is P(x, y). The condition of an isolated pixel is as follows:
- condition: P(x, y)=“1” and peripheral 8 pixels {P(x, y−1), P(x, y+1), P(x−1, y−1), P(x−1, y), P(x−1, y+1), P(x+1, y−1), P(x+1, y), P(x+1, y+1)} are all “0”.
- When the above condition is satisfied, the pixel of interest P(x, y) is determined as an isolated pixel, and the value of the pixel of interest P(x, y) is converted to “0”.
- When the pixel of interest is numeral 700 in
FIG. 7 , the pixel is determined as an isolated pixel and is converted to a white pixel. When the pixel of interest is numeral 701, since one of the peripheral pixels, i.e., the neighboring pixel on the right is “1”, the pixel ofinterest 701 is determined not to be an isolated pixel, and the original binary data is maintained. - The above processing is sequentially performed with respect to the pixel of interest P(x, y) by updating x and y in accordance with raster scanning, thereby executing binarization based on the luminance component of the inputted image.
- In step S203, the binary image data, on which above-described noise removal processing has been performed, is inputted, then block selection processing (image region division processing that divides the image into blocks in units of attribute) is performed, and block information and attribute information are added to the image data before being outputted.
- In block selection processing, an attribute is given to the image region based on the size of the black pixel group included in the binary image data and uniformity of the image block. In this manner, from the page data of the binary image, it is possible to obtain somewhat cohesive image blocks consisting of titles, texts, graphics, tables, and images (pictures), as well as its attribute, positional information and size of the block.
-
FIG. 8 shows the result of image region division using block selection. In FIG. 8, the inputted image data is divided into block regions including eight text regions Text 1-8, one table region Table 1, and one picture region Picture 1, as well as three line regions Line 1-3. -
FIG. 9A shows a logical structure (tree structure) obtained as a result of block selection processing. With the page image as the root, processing for text images is performed on all block data obtained herein. As shown in FIG. 9B, each node includes attribute information indicating whether the node is a text region or a table region, the coordinate position (x, y) of the top left corner of the region, and the width and height of the region, given that the region is rectangular. With respect to a text region, in order to manage the regions having text images as child nodes, each node includes a pointer to the child node and, furthermore, a pointer to the next node. Since a table region also includes texts, it has a similar structure.
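The node layout of FIG. 9B can be modelled directly. The field and function names here are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class BlockNode:
    attribute: str                       # 'text', 'table', 'line', 'picture', ...
    x: int                               # top-left corner of the rectangular region
    y: int
    width: int
    height: int
    text_color: Optional[Tuple[int, int, int]] = None  # (Y, Cb, Cr) for text nodes
    child: Optional["BlockNode"] = None  # e.g. text regions inside a table region
    next: Optional["BlockNode"] = None   # next sibling region on the page

def iter_siblings(node):
    """Walk the sibling chain, as steps S204/S208 do when visiting nodes."""
    while node is not None:
        yield node
        node = node.next
```

A table node would carry its contained text regions through `child`, giving the tree of FIG. 9A.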
FIG. 9B . - As a result of determination, if it is determined as a text region, processing in step S205 is performed. If it is determined as a non-text region, processing in step S204-2 is performed. In this embodiment, since the attribute of the first node (left end node) is a line as shown in
FIG. 9A , the region is determined as a non-text region, and processing in step S204-2 is performed. -
FIG. 10 is an explanatory view of non-text image deletion from a binary image. Based on node data of the block selection, an image region to be deleted is specified, and the specified image is deleted from the binary image data. - When processing on the target block is completed, step S208 is executed and a node of the next block selection is acquired. If node data cannot be acquired, in other words, if processing completion on all blocks is determined, processing in step S209 is executed. Herein assume that there is a next node, and a description is provided on a case where the control returns to step S204.
- In step S204, it is again determined whether the node attribute of the block selection subjected to processing is a text region or a non-text region. If the node of interest is a text region, processing in step S205 is executed.
- In step S205, node data of the block selection in a text region is analyzed to acquire text node data which is a final unit of a text block. Color data is added to the acquired text data.
- In text color determination processing, as shown in
FIG. 11 , color data of each pixel constructing the text corresponding to the node data subjected to current processing is extracted from the original YCbCr image data, and a text color is determined and added to the node data. - In the case of
FIG. 11 , since the color of the target text is black, text color data [0, 128, 128] (values of Y, Cb, Cr) is added to the node data. - In step S206, based on information corresponding to the node that is determined as a text in step S205, each pixel constructing the target text image is deleted from the original YCbCr color image.
FIG. 12 is an explanatory view of text image deletion. Pixel data corresponding to the pixel position having “1” in the binary image is deleted from the YCbCr color image. - In step S207, filling (painting the blank in the deleted portion) is performed on the color image where text image is deleted, using the color data of peripheral pixels. As shown in
FIG. 13A , if the known number of colors on the periphery of the filling-target pixel (pixel value (?, ?, ?)) is one, filling (painting) is performed with the same color as the peripheral pixels. If there are N types of colors in the neighboring pixels (two types inFIG. 13B ), an average value of the color components is calculated to determine the color of the filling-target pixel. In this manner, the color of the pixels where the text is deleted is filled with the peripheral color, i.e., the background color. - In the above-described manner, the processing of steps S204 to S208 is performed with respect to all the nodes. For a node having text attribute, the text color is determined, then the text image is deleted from the color image, and filling is performed with a peripheral color. For a node having non-text attribute, image data is deleted based on the binary image. Finally when processing on the page image is performed, a color image and a binary image shown in
FIGS. 14A and 14B can be obtained. Texts are deleted from the color image, and filling is performed on the deleted text pixels. Image data other than texts is deleted from the binary image. - In step S209, comparison processing is performed between the color image where text removal and filling have been performed and the color image of the previous page where the similar processing has been performed.
- Hereinafter, the color image comparison processing is described according to the flowchart in
FIG. 5 . - <Processing on First Page>
- In step S301, color image data of the target page, on which text removal and filling have been performed, is divided into blocks of an MCU (Minimum Coded Unit), which are the comparison unit. In this embodiment, assume that the MCU has a size of 16×16 pixel block. As shown in
FIG. 15 , image block addresses (1, 1) to (m, n) are assigned to respective blocks. Each of the divided blocks is the comparison unit. - In step S302, it is determined whether or not there is comparison-target image data. If there is no comparison-target image block data in the comparison image area, image processing in step S303 is performed. Assuming that the target page is the first page, since there is no comparison target, processing of step S303 is performed. Note that, besides the first page, if it is determined that there is no comparison target, the data in the comparison image area might have been flushed for some reason (e.g., lack of memory).
- In step S303, all data of the target page is stored as a page image in the page comparison image area (secured in the RAM), and the control ends.
- The page comparison image area has areas for storing addresses of respective block images, image data, page data and so forth as shown in
FIG. 16 . - Referring back to
FIG. 4 , in step S210, the text image is compressed. The text image compression is performed after a binary image is generated for each text color. -
FIGS. 17A to 17D are explanatory views of image data generation in units of text color. Based on text color data of each node acquired in block selection, the same or similar text colors are put together as one text color, and binary image data is generated in units of color. The number of approximating colors is statistically about 16 at most. - As a result, binary image data of respective colors can be obtained as shown in
FIGS. 17A to 17D. Each of these images is subjected to compression by MMR processing, which is a compression method appropriate for binary image data, and the page image as shown inFIG. 18 is stored as binary image data of the first page. - When page processing described above is completed, it is determined in step S103 in
FIG. 3 if there is a next page. Since there is still a page, image data of the second page is read and the processing similar to the first page is performed. - <Processing on Second Page>
-
FIG. 19 shows an inputted image of the second page as well as a binary image and a background image obtained as a result of processing in steps S201 to S208. - In step S301, the inputted background image is divided. Similarly to the first page, the image is divided into MCU blocks.
- The image data is divided in units of 16×16 pixels as shown in
FIG. 20 . Image block addresses (1, 1) to (m, n) are assigned to the divided blocks respectively, and comparison processing is performed in units of the divided block. - In step S302, it is determined whether or not there is a comparison-target image block in the comparison image area. Since the currently processed image is a second page, the page comparison image area contains image block data obtained in the previous processing. Therefore, comparison-base image block acquisition processing in step S304 is performed.
- In step S304, the image block of the second page that serves as a comparison base, is acquired. When processing is completed on all image blocks of the comparison base, the comparison processing ends.
- In step S304-2, image data to be compared with the block data of the original image is extracted from the common comparison image area and page comparison image area.
-
FIG. 21 is an explanatory view of comparison image block acquisition processing. As shown in FIG. 21, all image blocks having the same address as the comparison-base block address are extracted as comparison target data.
- First, components Y, Cb and Cr constituting the pixel are extracted from the image block, and a similarity level is calculated for each component.
FIG. 22 shows a similarity level calculation formula used for calculating the similarity level of the image blocks. Using the similarity level calculation formula where a similarity level is expressed by a vector's cosine value, the similarity level between the comparison-base image block and the comparison-target image block is calculated in units of pixel component. - In step S306, determination is made as to whether or not to consider the comparing image blocks the same image. Threshold values are set respectively for similarity levels of respective components Y, Cb and Cr. For instance, similarity level 95 or more is set as a threshold value of luminance data Y; and similarity level 90 is set as a threshold value of color difference data Cb and Cr.
- When the similarity level of the compared image block data is equal to or larger than the threshold value, the image block is stored as the same image in the common image area in step S307. When the similarity level is less than the threshold value, the image block is stored as a difference image in the page comparison image area in step S308.
- In step S307, the image block having a similarity level equal to or larger than the threshold value is stored in the common comparison image area.
- If the comparison-target image block exists in the common comparison image area, page data is added to the page data of the corresponding image block address of the common comparison image area. If the comparison-target image block exists in the page comparison image area, the image block data in the page comparison image area is moved to the common comparison image area, and page data of the current comparison-base image block is added.
- In step S308, with respect to the image block having a similarity level less than the threshold value, data of the currently processed image block is stored in the page comparison image area.
- As has been described above, when the second page is inputted, the image blocks of the first page have all been stored in the page comparison image area. Therefore, processing of steps S304 to S308 is repeated with respect to the image blocks of the inputted second page and the image blocks stored in the comparison image area. As a result, image block data can be stored in the page comparison image area and the common comparison image area shown in
FIGS. 23A and 23B . - When the above-described image data comparison processing of the second page is completed, binary image data is generated in step S210 similarly to the first page, thereby registering the second-page image data.
- <Processing On Third Page and Subsequent Pages>
- With respect to the third page and the subsequent pages, basically the similar processing to that of the second page is performed.
-
FIG. 24 shows image data of the inputted image of the third page, in which the text region that is the foreground is separated from the background. - Similarly to the data on the second page, the inputted image is divided into image blocks, and the image block data is compared to block data in the comparison image area to calculate the similarity level.
- Similarly to the second page, if the similarity level is equal to or larger than the threshold value, the image block data is stored in the common comparison image area; whereas if the similarity level is less than the threshold value, the image block data is stored in the page comparison image area.
- In this stage, even if the image block data is not common in all pages, the image block data is still registered in the common comparison image area. For instance, in a case where the image block data is common in the second and third pages but not in the first page, the image block data of the second and third pages is registered in the common comparison image area and the image block data of the first page is left in the page comparison image area.
- As described above, as a result of processing the third-page image data, image block data is stored in the comparison image areas as shown in FIGS. 25A and 25B.
- When the image data comparison processing on the third page is completed, binary image data is generated in step S210, similarly to the first and second pages, and the generated binary image data is registered as the third-page image data.
- By repeating the foregoing processing once for each page, the same processing can be applied to any number of pages of image data.
- In step S103 in FIG. 3, it is determined, based on the status signal from the image scanner 10, whether data indicating that no unread original document remains has been received. If comparison processing has been completed for all pages, control proceeds to step S104.
- In step S104, the image block data stored in the page comparison image area is added to the page data as respective page information.
- The image block data remaining in the page comparison image area is page-unique image data. Therefore, the image block data belonging to the same page is extracted to generate image data, and compression processing appropriate for the page image is performed. The generated image data is then pasted onto the page image as the background of the binary image generated for each page in step S210.
- In step S105, image block pasting is performed with respect to all image blocks stored in the page comparison image area. When pasting of all image blocks is completed, common image link generation processing in step S106 is performed.
- Numerals 2601 to 2603 in FIG. 26 show the pasting processing on the page image. For each piece of page-unique image data, the image data is pasted as the background of the binary image extracted as the foreground.
- In step S106, common image data is generated from the image block data stored in the common comparison image area, and appropriate compression is performed. Since the common image data is multi-valued color image data, a compression technique such as GIF or JPEG may be employed in the compression processing.
- The common image data is then pasted as a background of the page image data generated in step S104. The image data generated from the common comparison image area is not stored as tangible data in each page; instead, a link from one tangible copy of the data to the respective pages is generated, from which each page image is produced. When the common image data has an absent portion, it is complemented by filling with the peripheral color, similarly to the deleted-text filling processing in step S207.
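The peripheral-color complementing step might be sketched as below for a rectangular absent region; averaging the one-pixel border is an assumed stand-in for whatever filling rule step S207 actually uses.

```python
import numpy as np

def fill_with_peripheral_color(img, x0, y0, x1, y1):
    """Fill the rectangular absent region img[y0:y1, x0:x1] with the mean
    color of the one-pixel border surrounding it."""
    mask = np.zeros(img.shape[:2], dtype=bool)
    mask[y0:y1, x0:x1] = True
    border = np.zeros_like(mask)
    border[max(y0 - 1, 0):y1 + 1, max(x0 - 1, 0):x1 + 1] = True
    border &= ~mask                        # keep only the surrounding ring
    img[mask] = img[border].mean(axis=0).astype(img.dtype)
    return img
```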
- As shown in FIG. 27, partially common image data and all-page common image data are generated. The partially common image data is pasted as a background of the page image, and the all-page common image data is then superimposed as the background of all image data.
- In other words, the image data is completed by superimposing images from background to foreground, i.e., in the order of the all-page common image, the partially common image, the page image, and the binary image.
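The back-to-front superimposition order described above can be sketched as follows, with layers modeled as flat pixel lists and None marking transparency; this is an illustrative model, not the patent's actual rendering code.

```python
def compose_page(all_common, partial_common, page_unique, binary_fg):
    """Superimpose the four layers back to front: all-page common image,
    partially common image, page-unique image, and binary foreground.
    `None` marks a transparent pixel in a layer (the layer below shows)."""
    out = list(all_common)
    for layer in (partial_common, page_unique, binary_fg):
        for i, px in enumerate(layer):
            if px is not None:
                out[i] = px
    return out
```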
- Based on the data obtained by the above processing, a file in a structured document format that can describe a plural-page document (e.g., a PDF file) is generated, and the image file shown in FIG. 28 can be acquired as the output result.
- As has been described above, in the case of an original document constructed of plural pages, generating links to respective pages that share a common background enables data sharing and thus a reduction in the data amount.
- In the common background link generation processing in step S106 of the first embodiment, all-page common image data that is common to all pages and partially common image data that is common to only part of the pages are generated. The generated common image data is then linked to the corresponding pages to generate each page of image data.
- The second embodiment focuses on the partially common image data among the common image block data. Image block data that is common to more than 50% of all pages is treated as the all-page common image block.
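The majority rule of the second embodiment can be sketched as follows, assuming each registered common block carries the set of pages that share it; names and data layout are illustrative.

```python
def split_common(common_blocks, total_pages):
    """Second-embodiment rule (sketch): a block shared by more than half of
    the pages is promoted to the all-page common image; the rest remains
    partially common."""
    all_common, partial = {}, {}
    for address, (block, pages) in common_blocks.items():
        target = all_common if len(pages) > total_pages / 2 else partial
        target[address] = (block, pages)
    return all_common, partial
```

For a three-page document, a block shared by two pages clears the 50% bar, which matches the promotion of the first-second and second-third page common data described below.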
- For instance, the first-second page common image data and the second-third page common image data shown in FIG. 27, which are partially common image data, include image data that is common to more than 50% of all pages (three pages). These partially common image data can therefore be processed as all-page common image data.
- In the second embodiment, therefore, the only image data having links to respective pages is the common image data, as shown in FIG. 29. Compared to the first embodiment, the second embodiment thus improves the compression efficiency by the size of the partial image data; the larger the number of pages processed, the greater the effect.
- In the first embodiment, image block data includes image block addresses, image data, and page data.
- In the third embodiment, in addition to the above data, the image block data contains information on whether filling processing of the foreground has been performed on the image block. When a comparison-target image block is determined to be the same as a comparison-base image block, the processing in step S307 is performed, in which the image block is registered in the common comparison image area as common image data. At this stage, an image block that has not been subjected to filling is registered preferentially as the image data of the image block data.
- Furthermore, since the image block data contains information on the rate of pixels subjected to filling processing, the image data having the smaller pixel filling rate is registered in the image block registration in step S307. FIG. 30 is an explanatory view of the processing according to the third embodiment.
- For instance, when image blocks 1 and 2 are processed as the same image, image block 2, which has the smaller filling rate, is registered as the image block data. Further, when the registered image block and image block 3 are processed as the same image, image block 3, which has the smaller filling rate, is registered as the image block data.
- In the processing according to the first embodiment, when the text image "B" is superimposed on a background image that has been subjected to filling, a trace of the filling remains to some extent. By contrast, in the third embodiment, the ultimately acquired image block has the smallest filling rate (closest to the original background), so the trace of filling is not conspicuous. The third embodiment therefore achieves a higher-quality image than the first embodiment.
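The fill-rate preference of the third embodiment can be sketched as below; the ImageBlock type and its field names are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ImageBlock:
    pixels: object      # the block's image data
    fill_rate: float    # fraction of pixels overwritten by text-deletion filling

def keep_lower_fill_rate(registered, candidate):
    """Third-embodiment rule (sketch): when two blocks are judged to be the
    same image, keep the one with the smaller filling rate, since it is
    closer to the original background."""
    return candidate if candidate.fill_rate < registered.fill_rate else registered
```

Applied repeatedly over the pages, the surviving block is the one with the smallest filling rate, which is why the trace of filling in the final background is minimized.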
- In the image data division processing according to the first embodiment, image block division is performed in units of 16×16 pixels and comparison is performed.
- For instance, when the image blocks 1, 2, and 3 shown in FIG. 31 are compared, the comparison finds the image blocks to be different. In the extreme case, if even 1 of the 256 pixels in a 16×16 pixel block differs, the image blocks are considered different.
- In view of this, even when the image block comparison processing does not find the image blocks to be the same, if the blocks have a certain degree of similarity, the fourth embodiment further segmentalizes the image blocks and compares them again.
- In FIG. 31, assume that the similarity level of image blocks 1 and 2 is 70%, that of image blocks 1 and 3 is 70%, and that of image blocks 2 and 3 is 80%. If the threshold value for the same-image-block determination is 90%, image blocks 1, 2, and 3 are all recognized as different images.
- According to the fourth embodiment, if the similarity level of the image blocks is, e.g., 70% or more, the image blocks are further segmentalized to determine whether they are the same image block.
- For instance, as shown in FIG. 31, the 16×16 pixel block is segmentalized into 8×8 pixel image blocks, and each of the segmentalized blocks is reevaluated. As a result, three of the segmentalized regions of pixel block 2 can be determined to be the same as the corresponding regions of pixel block 1, so only the remaining region 4 is stored with respect to pixel block 2.
- As described above, for image blocks that the comparison processing of the first embodiment finds to be different despite their similarity, ¾ of the sub-blocks are recognized as common images in the fourth embodiment. As a result, the recognition rate improves.
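The quadrant re-evaluation can be sketched as follows for a 16×16 block split into four 8×8 sub-blocks; the pixel-match similarity measure and the 0.9 threshold are assumed, as before.

```python
import numpy as np

def split_quadrants(block):
    """Split a block into four equal quadrants: top-left, top-right,
    bottom-left, bottom-right."""
    h, w = block.shape[0] // 2, block.shape[1] // 2
    return [block[:h, :w], block[:h, w:], block[h:, :w], block[h:, w:]]

def shared_quadrants(a, b, threshold=0.9):
    """Re-evaluate two blocks quadrant by quadrant; True means the 8x8
    sub-block is similar enough to be shared as common image data."""
    return [float(np.mean(qa == qb)) >= threshold
            for qa, qb in zip(split_quadrants(a), split_quadrants(b))]
```

When only one quadrant differs, three of the four sub-blocks can be shared, so only the differing quadrant needs to be stored as page-unique data.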
- If the segmentalization is carried further, down to the level of individual pixels, only the portion for which filling processing has failed is extracted as a page image, as shown in FIG. 32, and the remaining image data can be stored as common image data.
- Preferred embodiments of the present invention have been described above. Much of the processing described in the embodiments is realized by a computer program executed on a data processing apparatus, and the present invention naturally includes such a computer program. Furthermore, the computer program is normally stored in a computer-readable storage medium such as a CD-ROM, and becomes executable when the storage medium is set in a computer and the program is copied or installed to the system. Therefore, the present invention also includes such a computer-readable storage medium.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
- This application claims the benefit of Japanese Patent Application No. 2005-347934 filed Dec. 1, 2005, which is hereby incorporated by reference herein in its entirety.
Claims (8)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005347934A JP4817821B2 (en) | 2005-12-01 | 2005-12-01 | Image processing apparatus, control method therefor, computer program, and computer-readable storage medium |
JP2005-347934 | 2005-12-01 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20070127043A1 true US20070127043A1 (en) | 2007-06-07 |
US8319987B2 US8319987B2 (en) | 2012-11-27 |
Family
ID=38134703
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/564,426 Expired - Fee Related US8319987B2 (en) | 2005-12-01 | 2006-11-29 | Image processing apparatus and control method for compressing image data by determining common images amongst a plurality of page images |
Country Status (2)
Country | Link |
---|---|
US (1) | US8319987B2 (en) |
JP (1) | JP4817821B2 (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060204111A1 (en) * | 2005-03-08 | 2006-09-14 | Fuji Xerox Co., Ltd. | Translated document image production device, recording medium and translated document image production method |
US20080074685A1 (en) * | 2006-09-25 | 2008-03-27 | Konica Minolta Business Technologies, Inc. | Image processing apparatus, image processing method, and computer readable recording medium stored with image processing program |
US20090002762A1 (en) * | 2007-06-28 | 2009-01-01 | Konica Minolta Business Technologies,Inc. | Image processing apparatus, computer readable recording medium stored with image processing program, and image processing method |
US20090316217A1 (en) * | 2008-06-18 | 2009-12-24 | Canon Kabushiki Kaisha | Image processing device, image processing method and computer readable medium |
US20100079781A1 (en) * | 2008-10-01 | 2010-04-01 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US20100082709A1 (en) * | 2008-10-01 | 2010-04-01 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US20100156919A1 (en) * | 2008-12-19 | 2010-06-24 | Xerox Corporation | Systems and methods for text-based personalization of images |
US20110040735A1 (en) * | 2009-08-13 | 2011-02-17 | Hon Hai Precision Industry Co., Ltd. | System and method for compressing files |
US20110075170A1 (en) * | 2009-09-30 | 2011-03-31 | Kyocera Mita Corporation | Image processing apparatus and image forming apparatus using same |
US20130242160A1 (en) * | 2012-03-14 | 2013-09-19 | Casio Computer Co., Ltd. | Image processing apparatus capable of specifying positions on screen |
CN104243768A (en) * | 2013-06-19 | 2014-12-24 | 夏普株式会社 | IMAGE PROCESSING APPARATUS and IMAGE FORMING APPARATUS |
US20150379750A1 (en) * | 2013-03-29 | 2015-12-31 | Rakuten ,Inc. | Image processing device, image processing method, information storage medium, and program |
US20160217117A1 (en) * | 2015-01-27 | 2016-07-28 | Abbyy Development Llc | Smart eraser |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8326051B1 (en) | 2008-02-22 | 2012-12-04 | Teradici Corporation | Method and apparatus for progressive encoding for text transmission |
JP5328510B2 (en) | 2009-06-24 | 2013-10-30 | キヤノン株式会社 | Image processing apparatus, image processing method, and computer program |
JP2012205181A (en) * | 2011-03-28 | 2012-10-22 | Fuji Xerox Co Ltd | Image processing device and program |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6301386B1 (en) * | 1998-12-09 | 2001-10-09 | Ncr Corporation | Methods and apparatus for gray image based text identification |
US20020037100A1 (en) * | 2000-08-25 | 2002-03-28 | Yukari Toda | Image processing apparatus and method |
US20030035147A1 (en) * | 2001-08-15 | 2003-02-20 | Eastman Kodak Company | Authentic document and method of making |
US20030107753A1 (en) * | 2001-12-06 | 2003-06-12 | Yoichi Sakamoto | Image processing apparatus and method, program, and storage medium |
US6587583B1 (en) * | 1999-09-17 | 2003-07-01 | Kurzweil Educational Systems, Inc. | Compression/decompression algorithm for image documents having text, graphical and color content |
US6628833B1 (en) * | 1999-06-30 | 2003-09-30 | Minolta Co., Ltd. | Image processing apparatus, image processing method, and recording medium with image processing program to process image according to input image |
US20030210803A1 (en) * | 2002-03-29 | 2003-11-13 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US20040071478A1 (en) * | 2002-10-10 | 2004-04-15 | Kohji Katamoto | Image forming apparatus |
US20040136586A1 (en) * | 2002-07-29 | 2004-07-15 | Yukihiro Okamura | Apparatus and method for processing images of negotiable instruments |
US20040205081A1 (en) * | 2003-04-10 | 2004-10-14 | Hui Chao | Method and apparatus for classifying elements of a document |
US20050050452A1 (en) * | 2003-08-27 | 2005-03-03 | Weitzel Wade D. | Systems and methods for generating an electronically publishable document |
US20050141035A1 (en) * | 2003-12-04 | 2005-06-30 | Xerox Corporation | System and method for processing portions of documents using variable data |
US20050172225A1 (en) * | 2004-01-30 | 2005-08-04 | Canon Kabushiki Kaisha | Document processing apparatus, document processing method, and document processing program |
US20050180648A1 (en) * | 2004-02-12 | 2005-08-18 | Xerox Corporation | Systems and methods for adjusting image data to form highly compressible image planes |
US20050190981A1 (en) * | 2004-02-26 | 2005-09-01 | Xerox Corporation | System for recording image data from a set of sheets having similar graphic elements |
US20060045357A1 (en) * | 2004-08-25 | 2006-03-02 | Schwartz Edward L | Multi-resolution segmentation and fill |
US7010745B1 (en) * | 1999-07-01 | 2006-03-07 | Sharp Kabushiki Kaisha | Border eliminating device, border eliminating method, and authoring device |
US20060072830A1 (en) * | 2004-02-26 | 2006-04-06 | Xerox Corporation | Method for automated image indexing and retrieval |
US20070065040A1 (en) * | 2005-09-22 | 2007-03-22 | Konica Minolta Systems Laboratory, Inc. | Photo image matching method and apparatus |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1051642A (en) | 1996-07-31 | 1998-02-20 | Fuji Xerox Co Ltd | Image processor |
JPH10215379A (en) | 1997-01-30 | 1998-08-11 | Fuji Xerox Co Ltd | Image coder and image decoder |
JP2000306103A (en) | 1999-04-26 | 2000-11-02 | Canon Inc | Method and device for information processing |
JP2002024799A (en) * | 2000-07-03 | 2002-01-25 | Minolta Co Ltd | Device, method and recording medium for image processing |
US6888968B1 (en) * | 2000-09-21 | 2005-05-03 | Kabushiki Kaisha Toshiba | Image processing apparatus and image processing method |
JP4678814B2 (en) * | 2001-09-04 | 2011-04-27 | キヤノン株式会社 | Image processing method and apparatus |
JP2003244448A (en) * | 2002-02-15 | 2003-08-29 | Canon Inc | Encoding method and decoding method |
US7206450B2 (en) * | 2002-04-25 | 2007-04-17 | Microsoft Corporation | Compression of bi-level images with explicit representation of ink clusters |
JP4047192B2 (en) * | 2003-02-24 | 2008-02-13 | キヤノン株式会社 | Image compression apparatus and method, image expansion apparatus and method, and program |
JP2005184114A (en) * | 2003-12-16 | 2005-07-07 | Canon Inc | Image processor, image processing method, program, and recording medium |
-
2005
- 2005-12-01 JP JP2005347934A patent/JP4817821B2/en not_active Expired - Fee Related
-
2006
- 2006-11-29 US US11/564,426 patent/US8319987B2/en not_active Expired - Fee Related
Patent Citations (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6301386B1 (en) * | 1998-12-09 | 2001-10-09 | Ncr Corporation | Methods and apparatus for gray image based text identification |
US6628833B1 (en) * | 1999-06-30 | 2003-09-30 | Minolta Co., Ltd. | Image processing apparatus, image processing method, and recording medium with image processing program to process image according to input image |
US7010745B1 (en) * | 1999-07-01 | 2006-03-07 | Sharp Kabushiki Kaisha | Border eliminating device, border eliminating method, and authoring device |
US6587583B1 (en) * | 1999-09-17 | 2003-07-01 | Kurzweil Educational Systems, Inc. | Compression/decompression algorithm for image documents having text, graphical and color content |
US20020037100A1 (en) * | 2000-08-25 | 2002-03-28 | Yukari Toda | Image processing apparatus and method |
US20030035148A1 (en) * | 2001-08-15 | 2003-02-20 | Eastman Kodak Company | Authentic document and method of making |
US20030035147A1 (en) * | 2001-08-15 | 2003-02-20 | Eastman Kodak Company | Authentic document and method of making |
US20030107753A1 (en) * | 2001-12-06 | 2003-06-12 | Yoichi Sakamoto | Image processing apparatus and method, program, and storage medium |
US20030210803A1 (en) * | 2002-03-29 | 2003-11-13 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US20070127771A1 (en) * | 2002-03-29 | 2007-06-07 | Canon Kabushiki Kaisha | Image processing apparatus and method |
US20040136586A1 (en) * | 2002-07-29 | 2004-07-15 | Yukihiro Okamura | Apparatus and method for processing images of negotiable instruments |
US20040071478A1 (en) * | 2002-10-10 | 2004-04-15 | Kohji Katamoto | Image forming apparatus |
US20040205081A1 (en) * | 2003-04-10 | 2004-10-14 | Hui Chao | Method and apparatus for classifying elements of a document |
US20050050452A1 (en) * | 2003-08-27 | 2005-03-03 | Weitzel Wade D. | Systems and methods for generating an electronically publishable document |
US20050141035A1 (en) * | 2003-12-04 | 2005-06-30 | Xerox Corporation | System and method for processing portions of documents using variable data |
US20050172225A1 (en) * | 2004-01-30 | 2005-08-04 | Canon Kabushiki Kaisha | Document processing apparatus, document processing method, and document processing program |
US20050180648A1 (en) * | 2004-02-12 | 2005-08-18 | Xerox Corporation | Systems and methods for adjusting image data to form highly compressible image planes |
US20050190981A1 (en) * | 2004-02-26 | 2005-09-01 | Xerox Corporation | System for recording image data from a set of sheets having similar graphic elements |
US20060072830A1 (en) * | 2004-02-26 | 2006-04-06 | Xerox Corporation | Method for automated image indexing and retrieval |
US7292710B2 (en) * | 2004-02-26 | 2007-11-06 | Xerox Corporation | System for recording image data from a set of sheets having similar graphic elements |
US7324711B2 (en) * | 2004-02-26 | 2008-01-29 | Xerox Corporation | Method for automated image indexing and retrieval |
US20080055669A1 (en) * | 2004-02-26 | 2008-03-06 | Xerox Corporation | Method for automated image indexing and retrieval |
US7813595B2 (en) * | 2004-02-26 | 2010-10-12 | Xerox Corporation | Method for automated image indexing and retrieval |
US20060045357A1 (en) * | 2004-08-25 | 2006-03-02 | Schwartz Edward L | Multi-resolution segmentation and fill |
US20070065040A1 (en) * | 2005-09-22 | 2007-03-22 | Konica Minolta Systems Laboratory, Inc. | Photo image matching method and apparatus |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060204111A1 (en) * | 2005-03-08 | 2006-09-14 | Fuji Xerox Co., Ltd. | Translated document image production device, recording medium and translated document image production method |
US7606419B2 (en) * | 2005-03-08 | 2009-10-20 | Fuji Xerox Co., Ltd. | Translated document image production device, recording medium and translated document image production method |
US20080074685A1 (en) * | 2006-09-25 | 2008-03-27 | Konica Minolta Business Technologies, Inc. | Image processing apparatus, image processing method, and computer readable recording medium stored with image processing program |
US20090002762A1 (en) * | 2007-06-28 | 2009-01-01 | Konica Minolta Business Technologies,Inc. | Image processing apparatus, computer readable recording medium stored with image processing program, and image processing method |
US20090316217A1 (en) * | 2008-06-18 | 2009-12-24 | Canon Kabushiki Kaisha | Image processing device, image processing method and computer readable medium |
US8395819B2 (en) | 2008-06-18 | 2013-03-12 | Canon Kabushiki Kaisha | Image processing device, image processing method and computer readable medium |
US8370373B2 (en) * | 2008-10-01 | 2013-02-05 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US20100079781A1 (en) * | 2008-10-01 | 2010-04-01 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US9026564B2 (en) * | 2008-10-01 | 2015-05-05 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US20130124582A1 (en) * | 2008-10-01 | 2013-05-16 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US20100082709A1 (en) * | 2008-10-01 | 2010-04-01 | Canon Kabushiki Kaisha | Document processing system and control method thereof, program, and storage medium |
US8780131B2 (en) * | 2008-12-19 | 2014-07-15 | Xerox Corporation | Systems and methods for text-based personalization of images |
US20100156919A1 (en) * | 2008-12-19 | 2010-06-24 | Xerox Corporation | Systems and methods for text-based personalization of images |
US20110040735A1 (en) * | 2009-08-13 | 2011-02-17 | Hon Hai Precision Industry Co., Ltd. | System and method for compressing files |
US8405873B2 (en) * | 2009-09-30 | 2013-03-26 | Kyocera Mita Corporation | Image processing apparatus and image forming apparatus using same |
US20110075170A1 (en) * | 2009-09-30 | 2011-03-31 | Kyocera Mita Corporation | Image processing apparatus and image forming apparatus using same |
US20130242160A1 (en) * | 2012-03-14 | 2013-09-19 | Casio Computer Co., Ltd. | Image processing apparatus capable of specifying positions on screen |
US9402029B2 (en) * | 2012-03-14 | 2016-07-26 | Casio Computer Co., Ltd. | Image processing apparatus capable of specifying positions on screen |
US20150379750A1 (en) * | 2013-03-29 | 2015-12-31 | Rakuten ,Inc. | Image processing device, image processing method, information storage medium, and program |
US9905030B2 (en) * | 2013-03-29 | 2018-02-27 | Rakuten, Inc | Image processing device, image processing method, information storage medium, and program |
CN104243768A (en) * | 2013-06-19 | 2014-12-24 | 夏普株式会社 | IMAGE PROCESSING APPARATUS and IMAGE FORMING APPARATUS |
US9497486B2 (en) | 2013-06-19 | 2016-11-15 | Sharp Kabushiki Kaisha | Image processing apparatus, image forming apparatus and recording medium |
US20160217117A1 (en) * | 2015-01-27 | 2016-07-28 | Abbyy Development Llc | Smart eraser |
Also Published As
Publication number | Publication date |
---|---|
US8319987B2 (en) | 2012-11-27 |
JP4817821B2 (en) | 2011-11-16 |
JP2007158510A (en) | 2007-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8319987B2 (en) | Image processing apparatus and control method for compressing image data by determining common images amongst a plurality of page images | |
US7343046B2 (en) | Systems and methods for organizing image data into regions | |
US8218887B2 (en) | Enhanced method of multilayer compression of PDF (image) files using OCR systems | |
US7593961B2 (en) | Information processing apparatus for retrieving image data similar to an entered image | |
JP5302768B2 (en) | Image processing apparatus and image processing method | |
US7386166B2 (en) | Systems and methods for connecting regions image data having similar characteristics | |
JP4667062B2 (en) | Image analysis apparatus, image analysis method, and blob identification apparatus | |
US8503036B2 (en) | System and method of improving image quality in digital image scanning and printing by reducing noise in output image data | |
US20100246951A1 (en) | Colour correcting foreground colours for visual quality improvement | |
US10477063B2 (en) | Character detection and binarization | |
JPH05303632A (en) | Method and device for identifying similar color area of spot color image | |
CN1458791A (en) | Sectioned layered image system | |
WO2007078596A1 (en) | Compressing images in documents | |
JP2008028717A (en) | Image processor and method and program | |
JP2000306103A (en) | Method and device for information processing | |
US7065254B2 (en) | Multilayered image file | |
JP6923037B2 (en) | Image processing equipment, image processing methods and programs | |
US8270722B2 (en) | Image processing with preferential vectorization of character and graphic regions | |
JP2003219187A (en) | Image processing method and image processor | |
JP2007005907A (en) | Image processing method, image processor, image processing program, and recording medium | |
JP4759502B2 (en) | Image processing method, image processing apparatus, image processing program, and computer-readable recording medium recording the image processing program | |
JP2010028819A (en) | Image processing apparatus and method, and image reading apparatus | |
JP5884509B2 (en) | Image processing apparatus, image reading apparatus, and program | |
JP4719924B2 (en) | Image processing apparatus and image processing method | |
JP4584805B2 (en) | Image processing apparatus, image processing method, and image processing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CANON KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MAEKAWA, KOJI;REEL/FRAME:018560/0993 Effective date: 20061127 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20201127 |