US20140380209A1 - Method for operating portable devices having a touch screen - Google Patents
- Publication number
- US20140380209A1 (U.S. application Ser. No. 14/294,729)
- Authority
- US
- United States
- Prior art keywords
- screen
- shifting
- touch display
- coordinates
- full screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the present invention relates to portable devices in general, and particularly to a method for improving one-handed operability of a portable information terminal having a touch screen.
- a smartphone or a tablet is typically operated by touching an object, such as an icon, a character or a symbol, displayed on a touch screen with a finger.
- a touch panel for detecting the approach or touch of the finger is used.
- Because a portable information terminal is easy to carry, it is characterized in that a user can operate it with one hand while holding it in that hand.
- an operation method for operating a portable information terminal with one hand while holding it in the same hand is called one-handed operation.
- FIG. 8 shows a state when one-handed operation of a smartphone is performed with a left thumb.
- Since the thumb can operate the smartphone while the housing is held stably, it is the finger most suitable for one-handed operation.
- a screen displayed on the touch screen includes objects as targets of touch operations over the whole screen. Therefore, areas that cannot be operated with the thumb during one-handed operation exist on the touch screen, and these areas expand as the touch screen becomes larger. If the holding style is changed with one hand alone so as to operate the smartphone with the thumb or another finger while the housing is kept in an unstable attitude, there is a danger of dropping the smartphone.
- the display screen can be scrolled to display a hidden image.
- one-handed operation can be achieved if an object desired to input can be scrolled to come within the reach of the thumb.
- even when the screen is scrolled to its upper limit, indicating that it can be scrolled no further upward, a range beyond the reach of the thumb remains on the screen.
- scrolling cannot be applied when a screen that cannot be scrolled is displayed. Further, scrolling is done with a swipe or a flick of a finger, but this is a cumbersome way to provide input to an object, because it is not easy for an unskilled person to perform with one hand.
- a full screen having multiple graphical objects is initially presented on a touch display of a portable device.
- the touch display includes a comfortable operation area that is within the reach of a thumb of a hand holding the portable device, an inoperable area that is beyond the reach of the thumb of the hand holding the portable device, and a difficult operation area that is located between the comfortable operation area and the hand holding the portable device.
- the full screen is shifted in a direction of a palm of the hand holding the portable device to present a portion of the full screen on the touch display.
- the full screen presentation is restored on the touch display.
- FIG. 1 shows the three different operation areas on a smartphone during a one-handed operation
- FIG. 2 is a block diagram of a smartphone
- FIG. 3 is a block diagram of the software that constitutes an input system of the smartphone from FIG. 2 ;
- FIGS. 4A-4C are diagrams depicting a screen shifting operation
- FIG. 5 is a block diagram of the hardware configuration of the input system
- FIG. 6 is a flowchart of a method for performing a screen shifting operation
- FIG. 7 shows a screen shifting operation being performed on a window screen
- FIG. 8 shows a state when one-handed operation is performed with a left thumb on a smartphone.
- FIG. 1 shows a state in which one-handed operation of a smartphone 100 as an example of a portable information terminal is performed with a right thumb.
- In normal one-handed operation on a touch screen 101 , it is common practice to operate the touch panel with a thumb while holding the smartphone 100 in the right or left hand with a lower corner fitted in the palm.
- the range within which the thumb can operate the touch panel comfortably, without changing the holding position once the device is held, is a roughly arc-shaped area whose radius is the length of the thumb, centered around the base of the thumb.
- This area on the touch screen 101 is called a comfortable operation area 205 .
- the comfortable operation area 205 is surrounded by an outer boundary 201 far from the hand and an inner boundary 203 close to the hand. An area closer than the inner boundary 203 can be operated by bending the thumb without switching the smartphone 100 to the other hand, but is more difficult to operate than the comfortable operation area 205 . Therefore, this area is called a difficult operation area 207 . An area farther than the outer boundary 201 cannot be operated by the right thumb unless the smartphone 100 is switched to the other hand. Therefore, this area is called an inoperable area 209 .
- the comfortable operation area 205 corresponds to an area where the thumb is stretched naturally and hence easiest to perform an operation.
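The three areas above can be modeled as concentric arcs around the base of the thumb. A minimal Python sketch under that assumption (the purely radial model, function names, and coordinate units are illustrative, not the patent's exact geometry):

```python
import math

def classify_touch(x, y, base_x, base_y, inner_r, outer_r):
    """Classify a touch point into one of the three operation areas.

    (base_x, base_y) approximates the base of the thumb; inner_r and
    outer_r are the radii of the inner boundary 203 and the outer
    boundary 201 around that point.
    """
    d = math.hypot(x - base_x, y - base_y)  # distance from the thumb base
    if d > outer_r:
        return "inoperable"    # area 209: beyond the thumb's reach
    if d < inner_r:
        return "difficult"     # area 207: thumb must bend sharply
    return "comfortable"       # area 205: natural thumb arc
```

A point near the natural arc of the thumb falls in the comfortable area, while points closer to the palm or beyond the arc fall in the difficult and inoperable areas respectively.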
- FIG. 2 is a functional block diagram of the smartphone 100 .
- Although the smartphone 100 includes many functional devices such as a camera, audio, and radio, only the functional devices necessary to describe or understand the present invention are shown in FIG. 2 . Some of these functions can be integrated into one semiconductor chip or divided among individual semiconductor chips.
- a CPU 109 , a display 103 , and a main memory 113 are connected to an I/O controller 111 .
- the I/O controller 111 provides an interface function for controlling mutual data transfer among many peripheral devices, the CPU 109 , and the main memory 113 , where the peripheral devices include the display 103 .
- a liquid crystal display (LCD) is employed as the display 103 , but any other type of flat display panel such as organic EL can also be adopted.
- An in-cell touch panel formed with a transparent conductive film is provided in the display 103 as a touch panel 105 .
- the touch panel 105 may be formed with transparent electrodes as a separate member and overlapped on the display 103 .
- a projected capacitive type or surface capacitive type that outputs the coordinates of a position at which a finger has touched on or has approached the surface, a resistive film type that outputs the coordinates of a pressed position, or any other type can be employed.
- the projected capacitive type is employed.
- a complex made up by combining the touch panel 105 and the display 103 constitutes the touch screen 101 .
- the touch panel 105 is connected to a touch panel controller 115 .
- One or more pressure sensors 107 are placed in positions capable of detecting a pressing force exerted by a finger on the touch screen 101 .
- the pressure sensor 107 may be placed below the touch screen 101 , or on the back of a housing of the smartphone 100 .
- the pressure sensor 107 is connected to the touch panel controller 115 .
- the pressure sensor 107 cooperates with the touch panel 105 to generate an operation event for performing a screen shifting operation to be described later.
- the main memory 113 is a volatile memory for storing programs executed by the CPU 109 .
- the touch panel controller 115 converts a coordinate signal received from the touch panel 105 and a pressure signal received from the pressure sensor 107 into predetermined protocol data recognizable by a program, and outputs the predetermined protocol data to the system.
- a flash memory 117 is a nonvolatile memory for storing an OS and applications executed by the CPU 109 , and data.
- a program ( FIGS. 4A-4C ) for performing a screen shifting operation of the present invention is also stored in the flash memory 117 .
- An acceleration sensor 119 detects gravity acceleration generated in the housing of the smartphone 100 and the acceleration of vibration, and outputs acceleration data to the program.
- the acceleration sensor 119 has three detection axes (X axis, Y axis, and Z axis) that run at right angles to one another, and each detection axis detects and outputs a force component of the gravity acceleration and acceleration caused by impact applied to the housing of the smartphone 100 .
- the OS calculates a tilt angle with respect to the gravity direction of each detection axis from the force component of the gravity acceleration received from the acceleration sensor 119 , determines the longitudinal and lateral directions of the housing, and changes the display direction of a screen to be displayed on the display 103 to match the viewing direction of a user.
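The tilt calculation can be sketched as follows, assuming the standard relation that the angle between a detection axis and the gravity direction is the arccosine of that axis's normalized force component (function names and the orientation rule are illustrative assumptions):

```python
import math

def tilt_angles(ax, ay, az):
    """Tilt of each detection axis with respect to the gravity
    direction, in degrees, from the accelerometer's force components."""
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude of gravity
    return tuple(math.degrees(math.acos(c / g)) for c in (ax, ay, az))

def screen_orientation(ax, ay):
    """Pick the display direction from the dominant in-plane component,
    as the OS does when rotating the screen to match the viewer."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

For a device lying flat, gravity registers entirely on the Z axis, so the X and Y axes are tilted 90 degrees from the gravity direction and the Z axis 0 degrees.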
- FIG. 3 is a block diagram for describing a state when programs stored in the flash memory 117 are executed by the CPU 109 .
- a document application 155 and a browsing application 157 are shown in FIG. 3 as examples of applications.
- the document application 155 is executed to create a document
- the browsing application 157 is executed to access the Internet.
- a screen shifting application 159 provides a user interface for registering the outer boundary 201 and the inner boundary 203 to perform a screen shifting operation to be described below.
- the screen shifting operation program 153 cooperates with the OS 151 to perform processing for the screen shifting operation.
- the screen shifting operation program 153 is placed between an application layer and a layer of the OS 151 . Therefore, there is no need to alter the document application 155 and the browsing application 157 for the screen shifting operation or to add special code thereto.
- FIGS. 4A-4C are diagrams for describing an outline when the screen shifting operation is performed on the smartphone 100 with one-handed operation using the right hand as shown in FIG. 1 .
- the description will be made by taking the document application 155 as an example.
- a home screen 181 composed of multiple icons including an icon 155 a for the document application 155 is displayed on the touch screen 101 .
- the home screen 181 is displayed initially, before any application is started, and is also called a standby screen.
- the home screen is called a desktop screen in the case of a laptop PC.
- In addition to a client area 184 for displaying the home screen 181 , a system area 182 for indicating system information, such as the radio wave state, the time, and the charging state, also appears on the touch screen 101 .
- a user can display and operate an application screen in the client area 184 , but cannot access the system area 182 .
- a screen made up of the client area 184 and the system area 182 and displayed on the touch screen 101 is called the entire screen.
- the smartphone 100 is configured to display one application screen in the client area 184 on the touch screen 101 .
- When the size (pixel count) of one screen is larger than the resolution of the display 103 , part of it is hidden from the client area 184 , but the hidden part can be displayed by a scrolling action.
- the scrolling action can be achieved with a swipe or a flick on the touch screen 101 .
- When the icon 155 a is displayed in the comfortable operation area 205 , one-handed operation is possible with a tap of the thumb. However, even when the icon 155 a is displayed in the inoperable area 209 , one-handed operation is enabled by a method to be described later.
- When the icon 155 a is tapped, the document application 155 is started, and an application screen 155 b being edited is displayed in the client area 184 in a full-screen format as shown in FIG. 4B .
- the display position of the entire screen in FIG. 4B is called a standard position.
- the display in the full-screen format means that one application screen is displayed over the entire client area 184 , which is distinguished from a case where multiple application screens are displayed in a window format.
- the size of an image to be displayed in the full-screen format can be larger than the resolution of the touch screen 101 .
- an area hidden from the screen can be displayed by scrolling action.
- the display position of a screen displayed in the full-screen format cannot be changed on the home screen 181 .
- the application screen 155 b includes a software keyboard 251 .
- the software keyboard 251 may be part of the application 155 or be provided by the OS 151 .
- the OS 151 that detected the start of an application requiring keyboard input displays the software keyboard 251 in a position indicated by the application in the client area 184 to be superimposed on the application screen 155 b.
- a cursor 155 c indicative of an input position is displayed at the end of a sentence.
- the software keyboard 251 also includes a target object 253 corresponding to a phone key displayed in the inoperable area 209 .
- the target object 253 denotes an icon, a character, or a symbol to which the system responds with a tap or a flick on the touch screen 101 within the home screen 181 or the application screen 155 b .
- a reference position 255 of the touch screen 101 is defined at a corner of the touch screen 101 close to the hand holding the smartphone 100 .
- the touch panel operations are operations in which the touch panel 105 detects the coordinates of a touch or approach of a finger to the touch screen 101 with pressure that does not exceed the lower limit value of the pressure sensor 107 .
- the touch panel operations include multiple gestures, such as a touch of a finger on the touch screen 101 , a tap to release the finger soon after the touch, a swipe to move the finger while touching, and a flick to move the touching finger quickly.
- Tap gestures also include a long tap, in which the finger is held for a long time before being released.
- the system is also adapted to multi-touch operation, enabling input by a gesture for an operation using two or more fingers at the same time.
- the pressing operation is an operation in which the pressure sensor 107 detects pressure at or above a lower threshold value. In the pressing operation, a gesture in which the coordinates of the finger touching the touch screen 101 are detected also takes place, but the detected pressing force allows the system to distinguish it from the gestures of the touch panel operations.
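The two input classes can be separated by a simple threshold test on the pressure signal. A hedged sketch (the threshold value, function name, and routing labels are assumptions for illustration):

```python
LOWER_LIMIT = 1.5  # assumed lower limit of the pressure sensor (arbitrary units)

def dispatch_event(coords, pressure):
    """Route an input event: pressure below the lower limit is an
    ordinary touch panel operation; pressure at or above it is a
    pressing operation that drives the screen shifting operation."""
    if pressure >= LOWER_LIMIT:
        return ("pressing", coords)  # to the shifting direction determining section
    return ("touch", coords)         # to the application processing section
```

The same touch coordinates are reported in both cases; only the pressure value decides which processing path receives them.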
- the entire screen is shifted toward the pressed position 259 along the virtual shifting straight line 257 while maintaining screen consistency without changing the screen size, the shape of a content, and the arrangement of each of the application screen 155 b and the software keyboard 251 .
- a blank screen 261 is displayed on the left edge and the upper edge of the touch screen 101 , and at the same time, the application screen 155 b displayed in FIG. 4B runs off the lower and right edges.
- the blank screen 261 is a screen displayed in an area of the touch screen 101 for which no image data is supplied by an application processing section 309 to an image data generating section 301 ( FIG. 5 ); the display 103 renders this area in a color according to its normally white or normally black characteristic.
- the entire screen continues to be shifted during the pressing operation, and the target object 253 eventually reaches the pressed position 259 .
- When the thumb is released at the pressed position, the system that detected the release recognizes that there is input at those coordinates and acquires the input coordinates from the touch panel 105 . After the input is confirmed, the system returns the screen to the state in FIG. 4B and waits for the next input.
- the input operation after shifting the entire screen to make the target object 253 enter the comfortable operation area 205 this way is called a screen shifting operation.
- the screen shifting operation is a manipulation technique for easily achieving one-handed operation.
- the screen shifting operation is achieved by cooperation between a touch panel operation and the pressing operation.
- the pressing operation serves to allow the system to recognize that the coordinates detected by the touch panel 105 are accompanied by the screen shifting operation.
- When the screen shifting operation is performed, not only does part of the application screen displayed in the standard position run off, but a blank screen also appears.
- the entire screen may be defined as an application screen displayed in the client area 184 . In this case, only the application screen 155 b displayed in the client area 184 is shifted with the screen shifting operation without shifting the screen of the system area 182 .
- FIG. 5 is a functional block diagram showing the configuration of an input system 300 that supports the screen shifting operation.
- the input system 300 is configured of the hardware resources shown in FIG. 2 and software resources shown in FIG. 3 .
- the image data generating section 301 , a coordinate conversion section 303 , a shifting direction determining section 311 , and an input processing section 307 can be implemented mainly by cooperation between the OS 151 and the screen shifting operation program 153 , and hardware resources such as the CPU 109 executing these software resources and the main memory 113 .
- the application processing section 309 is implemented mainly by cooperation between software resources, such as the document application 155 , the browsing application 157 , the screen shifting application 159 , and the OS 151 , and hardware resources for executing these software resources.
- An application developer can create code without considering the screen shifting operation at all.
- the input processing section 307 receives coordinate data and pressure data from the touch panel controller 115 , and receives acceleration data from the acceleration sensor 119 .
- the input processing section 307 determines a touch panel operation, and sends the application processing section 309 the coordinate data and the acceleration data received.
- When the input processing section 307 determines the pressing operation, it sends the shifting direction determining section 311 the pressure data, the coordinate data, and the acceleration data received.
- the input processing section 307 sends the coordinates of the pressed position 259 to the coordinate conversion section 303 .
- the shifting direction determining section 311 registers data for defining the outer boundary 201 and the inner boundary 203 to identify the comfortable operation area 205 , and coordinate data on the reference position 255 .
- the shifting direction determining section 311 calculates formulas of the virtual shifting straight line 257 and a shifting straight line 258 .
- the shifting direction determining section 311 registers whether the hand holding the smartphone 100 when a special operation is performed is the right hand or the left hand.
- the coordinate conversion section 303 calculates a shifting vector from the formula of the shifting straight line 258 and the pressure data, calculates the coordinates of a reference position 186 of the entire screen when being displayed on the touch screen 101 , and sends the calculation results to the image data generating section 301 .
- the coordinate conversion section 303 converts coordinate data on the pressed position 259 received from the input processing section 307 into coordinate data on the standard position, and sends it to the application processing section 309 .
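The conversion back to standard-position coordinates amounts to subtracting the current offset of reference position 186 from the touched coordinates. A minimal sketch (function and parameter names are assumptions):

```python
def to_standard_position(pressed, shift):
    """Map coordinates touched on the shifted screen back to the
    coordinates the application would see at the standard position.

    `pressed` is the (x, y) touched on the shifted screen; `shift` is
    the current displacement of reference position 186 from the touch
    screen origin (0, 0).
    """
    return (pressed[0] - shift[0], pressed[1] - shift[1])
```

Because the application processing section 309 only ever receives standard-position coordinates, it remains unaware that the screen was ever shifted.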
- the application processing section 309 receives coordinate data on the input position from the input processing section 307 or the coordinate conversion section 303 , and executes the document application 155 or the browsing application 157 .
- the application processing section 309 does not recognize that the display position of the application screen 155 b is changed by the coordinate conversion section 303 .
- Based on an instruction from the application processing section 309 or the coordinate conversion section 303 , the image data generating section 301 generates pixel data to be displayed on the display 103 , and outputs the pixel data to the I/O controller 111 .
- FIG. 6 is a flowchart of a method for the input system 300 to perform the screen shifting operation.
- the screen shifting application 159 is started with a touch panel operation to register, with the shifting direction determining section 311 , the outer boundary 201 or the outer boundary 201 and the inner boundary 203 in FIG. 1 .
- the screen shifting application 159 displays a wizard screen on the display 103 to urge the user to tap several positions on the touch screen 101 with a thumb of the right hand and left hand in order by one-handed operation.
- Coordinate data on the tap positions are sent from the application processing section 309 to the shifting direction determining section 311 .
- the shifting direction determining section 311 creates, from the coordinates received, data indicative of the center of the annular-shaped comfortable operation area 205 approximated by a circular arc.
- the shifting direction determining section 311 defines the outer boundary 201 or the outer boundary 201 and the inner boundary 203 , which are concentric with the center of the circular arc, as circular arcs obtained by increasing/decreasing each radius at a predetermined ratio, and registers the coordinate data.
- Data on the outer boundary 201 and the inner boundary 203 may be generated directly from the coordinates of the ball of the thumb that touches the screen upon swiping with the thumb. Further, the shifting direction determining section 311 registers the coordinates of the reference position 255 ( FIG. 4 ) on the touch screen 101 .
- the coordinates of the reference position 255 can be the coordinates of the lower right corner of the touch screen 101 in the case of one-handed operation with the right hand or the coordinates of the lower left corner in the case of one-handed operation with the left hand.
- the coordinates of the reference position 255 can be the central coordinates of a circular arc when the outer boundary 201 and the inner boundary 203 are approximated by the circular arc.
- the central coordinates of the circular arc become a position close to the base of the thumb.
- the central coordinates of the circular arc may be located outside of the touch screen 101 .
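One way to obtain the arc's central coordinates from the wizard's sample taps is to take the circumcenter of three tap positions; the patent may fit more points, so three is only the illustrative minimum (Python sketch, names assumed):

```python
def arc_center(p1, p2, p3):
    """Center of the circle through three tap positions - a minimal
    way to approximate the center of the arc-shaped comfortable
    operation area 205 from the registration wizard's sample taps."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Standard circumcenter formula for three non-collinear points.
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    return (ux, uy)
```

The result may well lie outside the touch screen, consistent with the note above that the arc's center (near the base of the thumb) can fall off-screen.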
- the user performs a special operation to inform the system whether the hand holding the smartphone 100 at present is the right hand or the left hand.
- the special operation is not particularly limited as long as it can be distinguished from a touch panel operation for an object, but it is desired that the special operation can be performed in a state of continuing the one-handed operation without switching the smartphone 100 to the other hand.
- the special operation can be a gesture of swiping each comfortable operation area 205 with the right thumb or left thumb while pressing the thumb.
- the special operation can be an operation for characteristically shaking the smartphone once or a few times while touching each comfortable operation area with the right thumb or left thumb to cause the acceleration sensor 119 to generate an acceleration signal.
- the input processing section 307 sends the shifting direction determining section 311 pressure data, coordinate data, and acceleration data when the special operation is performed. From the pressure data, the coordinate data, or the acceleration data received, the shifting direction determining section 311 recognizes and registers whether the hand holding the smartphone at present is the right hand or the left hand.
- the icon 155 a is tapped on the home screen 181 in FIG. 4A to start the document application 155 .
- a target object displayed in the comfortable operation area 205 can be operated with a touch panel operation.
- a target object displayed in the difficult operation area 207 can also be operated with a touch panel operation.
- input to the icon 155 a can also be performed by the screen shifting operation.
- When the tap operation is performed on the icon 155 a displayed in the comfortable operation area 205 , the application screen 155 b is displayed on the display 103 in the full-screen format as shown in FIG. 4B .
- the OS 151 defines coordinates (0, 0) at the upper left corner of the touch screen 101 .
- the OS 151 defines the reference position 186 of the entire screen displayed on the touch screen 101 .
- the reference position 186 of the entire screen is defined at the upper left corner of the system area 182 .
- the application 155 requests the OS 151 to display the application screen 155 b
- the OS 151 displays the application screen 155 b in the client area 184 in the full-screen format.
- the display position of the entire screen on the touch screen 101 at this time is called a standard position.
- the screen shifting operation for the target object 253 that configures the application screen 155 b displayed in the inoperable area 209 is started.
- the cursor 155 c becomes the target object.
- the screen shifting operation can be performed after the target object 253 is scrolled and displayed in the inoperable area 209 .
- the user visualizes the virtual shifting straight line 257 that connects the reference position 255 set at the corner of the touch screen 101 with the target object 253 , and presses a position on it with the thumb.
- the pressed position 259 naturally comes to the range of the comfortable operation area 205 .
- the tip of the thumb only has to be directed naturally to the target object 253 .
- the inner boundary 203 is also defined for the difficult operation area 207 .
- a position close to the outer boundary 201 in the comfortable operation area 205 is pressed.
- the entire screen is so shifted that the difficult operation area 207 will approach the pressed position 259 .
- the input processing section 307 sends the shifting direction determining section 311 the coordinate data and the pressure data until input is confirmed.
- the shifting direction determining section 311 calculates the formula of the virtual shifting straight line 257 that connects the coordinates of the reference position 255 on the touch screen 101 with the coordinates of the pressed position 259 . Further, the shifting direction determining section 311 creates a formula of the shifting straight line 258 , which passes through the coordinates (0, 0) of the touch screen 101 matching the reference position 186 of the entire screen and is parallel with the virtual shifting straight line 257 , and sends the formula to the coordinate conversion section 303 . In block 411 , the coordinate conversion section 303 calculates a shifting vector from the formula of the shifting straight line 258 and the pressure data. The shifting vector is coordinate data on the reference position 186 of the entire screen to be shifted.
- a lower limit value and an upper limit value are set for the pressure data received from the pressure sensor 107, and a position vector is calculated by assigning the coordinates of the reference position 186 of the entire screen to pressure values between the lower limit value and the upper limit value.
- the reference position 186 immediately before the pressure data exceeds the lower limit value is set to the coordinates (0, 0) of the touch screen 101.
- the coordinates of the reference position 186 when the pressure data reaches the upper limit value are set to the intersection of the shifting straight line 258 with the right edge of the touch screen 101.
- the coordinates of the reference position 186 during the screen shifting operation lie at one of the positions on the shifting straight line 258, in proportion to the pressing force.
- the target object 253 is bound to pass through the pressed position 259 by the time the pressure data reaches the upper limit value.
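The mapping from pressure data to the coordinates of the reference position 186 can be sketched as follows, assuming a simple linear interpolation between the lower and upper limit values and a right-hand hold (direction with positive x); all names and the clamping behavior are illustrative:

```python
def reference_position_186(pressure, lower, upper, direction, right_edge_x):
    """Map the pressing force onto the shifting straight line 258.
    At or below the lower limit the entire screen stays at the standard
    position (0, 0); at the upper limit the reference position 186
    reaches the intersection of line 258 with the right edge of the
    touch screen; in between it moves in proportion to the force."""
    if pressure <= lower:
        return (0.0, 0.0)
    dx, dy = direction                       # assumed dx > 0 (right hand)
    end_x = right_edge_x
    end_y = right_edge_x * dy / dx           # intersection with right edge
    t = min((pressure - lower) / (upper - lower), 1.0)
    return (end_x * t, end_y * t)
```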
- a velocity vector corresponding to a change in the pressing force is calculated.
- the shifting velocity is made to correspond to a time differential value of the pressing force.
- the coordinates of the reference position 186 of the entire screen during the screen shifting operation are changed in such a manner that the application screen 155 b is shifted in the lower right direction when the pressure is increased, the shifting is stopped while there is no change in pressing force, and the application screen 155 b is shifted in a returning direction when the pressure is decreased. The shifting velocity can then be made proportional to the time differential value of the pressing force.
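The velocity-vector variant can be sketched as a per-sample update, assuming the time differential is approximated by a finite difference between consecutive pressure samples; the gain and all names are illustrative assumptions:

```python
def shift_step(prev_pressure, pressure, dt, direction, gain=1.0):
    """Velocity-vector variant: the shifting velocity is proportional
    to the time differential of the pressing force. Increasing pressure
    shifts the screen toward the hand, constant pressure stops it, and
    decreasing pressure shifts it back toward the standard position."""
    d_pressure = (pressure - prev_pressure) / dt   # dP/dt
    dx, dy = direction
    norm = (dx * dx + dy * dy) ** 0.5              # normalize direction
    speed = gain * d_pressure
    return (speed * dt * dx / norm, speed * dt * dy / norm)
```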
- the coordinate conversion section 303 sends the image data generating section 301 the coordinates of the reference position 186 of the entire screen continuously, at predetermined time intervals.
- the image data generating section 301 updates the image data to make the reference position 186 match the designated coordinates, and displays the entire screen in the shifted position while maintaining screen consistency.
- the application screen 155 b is displayed in a position shifted in a lower right direction of the touch screen 101 along with the shifting of the entire screen.
- the touch screen 101 displays the blank screen 261 on the left edge and the upper edge, and the target object 253 approaches the pressed position 259 while making the display of the application screen 155 b run off the lower right edges.
- when the first pressed position 259 is not appropriate, the user may change the pressed position 259 to correct the shifting direction.
- the procedure returns to block 409 .
- the input processing section 307 sends the shifting direction determining section 311 the coordinate data on the finger and the pressure data after the pressed position is changed.
- the shifting direction determining section 311 recalculates the virtual shifting straight line 257 and the shifting straight line 258 .
- the user who visually determines that the target object 253 reaches the pressed position 259 releases the finger quickly from the touch screen 101 .
- the finger is released quickly because, when the finger is released slowly after the screen shifting operation has started, the position vector or the velocity vector is recalculated without confirming the input, so that the entire screen can be returned to the standard position. However, the operation for confirming input need not be limited to the quick release of the finger.
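One way to distinguish the quick release from a slow release, assuming the pressure sensor is sampled periodically, is to check whether the pressure jumps straight to zero from a pressed level; this threshold logic is an illustrative assumption, not a method specified by the patent:

```python
def classify_release(pressure_trace, lower_limit):
    """A quick release jumps from at or above the lower limit straight
    to zero between two consecutive samples and confirms the input; a
    slow release passes through intermediate values first, letting the
    position or velocity vector be recalculated so that the screen
    returns to the standard position."""
    for prev, cur in zip(pressure_trace, pressure_trace[1:]):
        if cur == 0:
            return "quick" if prev >= lower_limit else "slow"
    return "still pressing"
```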
- the input processing section 307 determines that input is performed by the screen shifting operation, and sends the coordinates of the pressed position 259 to the coordinate conversion section 303 .
- the coordinate conversion section 303 calculates the shifting amount and shifting direction of the reference position 186 of the entire screen after the start of the screen shifting operation until the input is confirmed.
- the coordinate conversion section 303 converts the coordinates of the pressed position 259 shown in FIG. 4C into the coordinates of the target object 253 on the application screen 155 b displayed in the standard position of FIG. 4B , and sends the coordinates of the target object 253 to the application processing section 309 .
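The back-conversion in this step is a simple translation; a sketch, assuming the accumulated shift of the reference position 186 is tracked as an (x, y) offset (names are illustrative):

```python
def to_standard_coordinates(pressed_pos, screen_offset):
    """Convert the pressed position on the shifted screen (FIG. 4C)
    back into coordinates on the application screen displayed at the
    standard position (FIG. 4B) by undoing the accumulated shift of
    the reference position 186."""
    return (pressed_pos[0] - screen_offset[0],
            pressed_pos[1] - screen_offset[1])
```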
- the application processing section 309 recognizes that the application screen 155 b is always displayed in the standard position, and performs processing in response to the fact that the input operation is performed on the target object 253 .
- After sending the coordinates of the target object 253, the coordinate conversion section 303 requests the image data generating section 301 in block 423 to display the entire screen with the coordinates of the reference position 186 matching the coordinates (0, 0) of the touch screen 101, and waits for the next input.
- the shifting direction determining section 311 calculates the shifting direction to shift the application screen 155 b in an upper left direction.
- the display position of the entire screen is returned to the standard position in block 423 . This method is convenient when the next target object is displayed in the comfortable operation area in the standard position.
- target objects 253 and 254 are displayed together in the comfortable operation area 205 with one screen shifting operation.
- the coordinate conversion section 303 can stop returning the entire screen to the standard position to perform a touch panel operation continuously on the shifted application screen 155 b.
- when receiving an event of a quick release of the finger after the entire screen is shifted to a predetermined position, the coordinate conversion section 303 fixes the display position of the entire screen at the coordinates at that time. Then, when the target objects 253 and 254 are tapped in order while pressing is stopped, the input processing section sends the coordinate data to the application processing section 309. Further, when pressing is restarted, the shifting direction determining section 311 calculates a new virtual shifting straight line 257. Then, when the coordinate conversion section 303 shifts the entire screen again and receives an event of a quick release of the finger, the shifting of the entire screen is stopped again. Then, when receiving an event of releasing the finger slowly during the pressing operation, the coordinate conversion section 303 returns the display position of the entire screen to the standard position.
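The sequence of events above behaves like a small state machine; a sketch with illustrative state and event names (not terminology from the patent):

```python
def next_state(state, event):
    """Continuous-operation flow: a quick release freezes the shifted
    screen, pressing again starts a new shift along a freshly computed
    virtual line, and a slow release during pressing returns the
    display to the standard position. Unknown events leave the state
    unchanged."""
    transitions = {
        ("shifting", "quick_release"): "fixed",
        ("fixed", "press"): "shifting",
        ("shifting", "slow_release"): "standard",
    }
    return transitions.get((state, event), state)
```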
- FIG. 7 shows a state where the screen shifting operation is performed when an application screen 157 b of the browsing application 157 is displayed in a window format.
- an application screen displayed in the foreground becomes the target of the screen shifting operation, and the screens and the home screen 181 displayed in the background are not shifted.
- the application screen 157 b is shifted by the screen shifting operation within a range of the client area 184 , and the display runs off the edges.
- all characters, images, and icons with hyperlinks embedded therein can be set as target objects to perform input with the screen shifting operation.
- the application screen 157 b may be returned once to the position before the start of the screen shifting operation, or may be tapped after shifting is stopped to perform input to the next target object continuously.
- because the input system 300 can perform the screen shifting operation with a pressing operation and a touch panel operation on the surface of the display 103, one-handed operation can be easily performed while maintaining a stable hold.
- the present invention can be realized without using the pressure sensor. For example, when the touch screen 101 is pressed by a predetermined pressing force of a finger, the area of the touch of the finger can be calculated from the coordinates detected by the touch panel 105 to generate an event for performing the screen shifting operation.
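A sketch of this pressure-sensor-free variant, assuming the contact area can be estimated from the coordinates reported by the touch panel 105 when a fingertip flattens against the glass; the threshold is an arbitrary illustrative value:

```python
def press_event_from_area(contact_area_mm2, area_threshold_mm2=80.0):
    """A firmly pressed fingertip flattens and covers a larger area of
    the touch panel, so a large contact area can stand in for a
    pressure reading and generate the screen shifting event."""
    return contact_area_mm2 >= area_threshold_mm2
```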
- a special gesture can be defined for a touch panel operation to enable the screen shifting operation after the input system 300 enters a screen shifting operation mode.
- the shifting direction determining section 311 and the coordinate conversion section 303 are so configured that, while the input system 300 is in the screen shifting operation mode, an application screen is shifted with a swipe of a finger, and when the finger is released, the display position of the screen is fixed at the position.
- because the swipe creates a blank screen and causes the application screen to run off the edges, the user can perform input with a touch panel operation after shifting the entire screen or a window screen to a convenient position.
- the present disclosure provides a method for improving one-handed operability of a portable information terminal having a touch screen.
Abstract
Description
- The present application claims benefit of priority under 35 U.S.C. §§120, 365 to the previously filed Japanese Patent Application No. JP2013-130513 with a priority date of Jun. 21, 2013, which is incorporated by reference herein.
- 1. Technical Field
- The present invention relates to portable devices in general, and particularly to a method for improving one-handed operability of a portable information terminal having a touch screen.
- 2. Description of Related Art
- A smartphone or a tablet is typically operated by touching an object, such as an icon, a character or a symbol, displayed on a touch screen with a finger. In order to detect the coordinates of a touch, a touch panel for detecting the approach or touch of the finger is used. There is also an input system using a pressure sensor for detecting a pressing force exerted on the touch screen to supplement input from the touch panel.
- In recent years, there has been a growing trend to increase the size of a touch screen for a smartphone. Further, a tablet terminal with a small touch screen mounted thereon has emerged. Since a portable information terminal, such as the smartphone or the tablet terminal, is easy to carry, it is characterized in that a user can operate the portable information terminal with one hand while holding it in the hand. For the present specification, an operation method for operating a portable information terminal with one hand while holding it in the same hand is called one-handed operation.
- FIG. 8 shows a state when one-handed operation of a smartphone is performed with a left thumb. The thumb, which can operate the smartphone while the housing is held stably, is most suitable for one-handed operation. A screen displayed on the touch screen includes objects as targets of touch operations over the entire screen. Therefore, areas that cannot be operated with the thumb during one-handed operation exist on the touch screen, and those areas expand as the size of the touch screen becomes larger. If the holding style is changed with one hand alone in order to operate the smartphone with a thumb or another finger while keeping the housing in an unstable attitude, there is a danger of dropping the smartphone.
- For a smartphone, moving an icon to a position easy to operate with a thumb may facilitate one-handed operation. However, since the screen display is changed by moving the icon, the screen may become difficult to view, and the applicable screens are limited; this method cannot be applied to the screens of popular application programs, such as a browser screen and a text input screen. There is also a method of shifting an input screen horizontally toward the hand so that limited applications, such as a telephone keypad and a calculator pad, can be operated with a thumb. Even in this case, objects that cannot be operated with the thumb remain on the touch screen.
- When the size of an image is larger than what the touch screen of the smartphone can display, the display screen can be scrolled to display the hidden part. In this case, one-handed operation can be achieved if the object to be input can be scrolled to come within the reach of the thumb. However, once the screen reaches its scroll limit and can no longer be scrolled upward, a range beyond the reach of the thumb still remains. In addition, scrolling cannot be applied to a screen that cannot be scrolled. Further, scrolling is done with a swipe or a flick of a finger, which is a troublesome way to perform input to an object, because it is not easy for an unskilled person to do with one-handed operation.
- In accordance with a preferred embodiment of the present invention, a full screen having multiple graphical objects is initially presented on a touch display of a portable device. The touch display includes a comfortable operation area that is within the reach of a thumb of a hand holding the portable device, an inoperable area that is beyond the reach of the thumb of the hand holding the portable device, and a difficult operation area that is located between the comfortable operation area and the hand holding the portable device. In response to a request for a screen shifting operation, the full screen is shifted in a direction of a palm of the hand holding the portable device to present a portion of the full screen on the touch display. After receiving and confirming a user input from the touch screen, the full screen presentation is restored on the touch display.
- All features and advantages of the present disclosure will become apparent in the following detailed written description.
- The disclosure itself, as well as a preferred mode of use, further objects, and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
- FIG. 1 shows the three different operation areas on a smartphone during a one-handed operation;
- FIG. 2 is a block diagram of a smartphone;
- FIG. 3 is a block diagram of the software that constitutes an input system of the smartphone from FIG. 2;
- FIGS. 4A-4C are diagrams depicting a screen shifting operation;
- FIG. 5 is a block diagram of the hardware configuration of the input system;
- FIG. 6 is a flowchart of a method for performing a screen shifting operation;
- FIG. 7 shows a screen shifting operation being performed on a window screen; and
- FIG. 8 shows a state when one-handed operation is performed with a left thumb on a smartphone.
- FIG. 1 shows a state in which one-handed operation of a smartphone 100, as an example of a portable information terminal, is performed with a right thumb. In normal one-handed operation on a touch screen 101, it is common practice to operate a touch panel with a thumb while holding the smartphone 100 in the right hand or the left hand with a lower corner fitted in the palm. The range in which the touch panel can be operated comfortably with the thumb, without changing the holding position once the smartphone is held, is a roughly arc-like range whose radius is the length of the thumb, centered around the base of the thumb. This area on the touch screen 101 is called a comfortable operation area 205.
- The comfortable operation area 205 is surrounded by an outer boundary 201 far from the hand and an inner boundary 203 close to the hand. An area closer than the inner boundary 203 can be operated by bending the thumb without switching the smartphone 100 to the other hand, but is more difficult to operate than the comfortable operation area 205. Therefore, this area is called a difficult operation area 207. An area farther than the outer boundary 201 cannot be operated by the right thumb unless the smartphone 100 is switched to the other hand. Therefore, this area is called an inoperable area 209. The comfortable operation area 205 corresponds to the area where the thumb stretches naturally and is hence the easiest in which to perform an operation.
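With the arc-like approximation just described, a touch position can be classified into the three areas by its distance from the base of the thumb; a sketch with illustrative parameter names (the radii correspond to the inner boundary 203 and the outer boundary 201):

```python
import math

def classify_touch(pos, thumb_base, inner_radius, outer_radius):
    """Classify a touch position into the three operation areas of
    FIG. 1 using concentric arcs around the base of the thumb."""
    d = math.hypot(pos[0] - thumb_base[0], pos[1] - thumb_base[1])
    if d < inner_radius:
        return "difficult operation area 207"
    if d <= outer_radius:
        return "comfortable operation area 205"
    return "inoperable area 209"
```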
- FIG. 2 is a functional block diagram of the smartphone 100. Although the smartphone 100 includes many functional devices, such as a camera, audio, and radio, only the functional devices necessary to describe or understand the present invention are shown in FIG. 2. Some of these functions can be integrated into one semiconductor chip or divided among individual semiconductor chips. A CPU 109, a display 103, and a main memory 113 are connected to an I/O controller 111. The I/O controller 111 provides an interface function for controlling mutual data transfer among many peripheral devices, the CPU 109, and the main memory 113, where the peripheral devices include the display 103.
- As an example, a liquid crystal display (LCD) is employed as the display 103, but any other type of flat display panel, such as organic EL, can also be adopted. An in-cell touch panel formed with a transparent conductive film is provided in the display 103 as a touch panel 105. As another example, the touch panel 105 may be formed with transparent electrodes as a separate member and overlapped on the display 103.
- As the touch panel 105, a projected capacitive type or surface capacitive type that outputs the coordinates of a position at which a finger has touched or approached the surface, a resistive film type that outputs the coordinates of a pressed position, or any other type can be employed. In this embodiment, the projected capacitive type is employed. A complex made up by combining the touch panel 105 and the display 103 constitutes the touch screen 101. The touch panel 105 is connected to a touch panel controller 115.
- One or more pressure sensors 107 are placed in positions capable of detecting a pressing force exerted by a finger on the touch screen 101. The pressure sensor 107 may be placed below the touch screen 101, or on the back of a housing of the smartphone 100. The pressure sensor 107 is connected to the touch panel controller 115. The pressure sensor 107 cooperates with the touch panel 105 to generate an operation event for performing a screen shifting operation to be described later. The main memory 113 is a volatile memory for storing programs executed by the CPU 109.
- The touch panel controller 115 converts a coordinate signal received from the touch panel 105 and a pressure signal received from the pressure sensor 107 into predetermined protocol data recognizable by a program, and outputs the protocol data to the system. A flash memory 117 is a nonvolatile memory for storing an OS, applications executed by the CPU 109, and data. A program (FIGS. 4A-4C) for performing a screen shifting operation of the present invention is also stored in the flash memory 117.
- An acceleration sensor 119 detects the gravity acceleration generated in the housing of the smartphone 100 and the acceleration of vibration, and outputs acceleration data to the program. The acceleration sensor 119 has three detection axes (X axis, Y axis, and Z axis) that run at right angles to one another, and each detection axis detects and outputs a force component of the gravity acceleration and of acceleration caused by impact applied to the housing of the smartphone 100. The OS calculates a tilt angle with respect to the gravity direction of each detection axis from the force components received from the acceleration sensor 119, determines the longitudinal and lateral directions of the housing, and changes the display direction of a screen to be displayed on the display 103 to match the viewing direction of the user.
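The tilt calculation performed by the OS can be sketched with the standard gravity-component formulation; the axis conventions and function names are assumptions, not taken from the patent:

```python
import math

def tilt_angles(ax, ay, az):
    """Tilt of each detection axis with respect to the gravity
    direction, computed from the force components reported by the
    acceleration sensor 119."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return tuple(math.degrees(math.acos(c / g)) for c in (ax, ay, az))

def display_orientation(ax, ay):
    """Choose the display direction from whichever in-plane axis
    carries more of the gravity component."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```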
- FIG. 3 is a block diagram for describing a state when programs stored in the flash memory 117 are executed by the CPU 109. A document application 155 and a browsing application 157 are shown in FIG. 3 as examples of applications. The document application 155 is executed to create a document, and the browsing application 157 is executed to access the Internet.
- A screen shifting application 159 provides a user interface for registering the outer boundary 201 and the inner boundary 203 to perform a screen shifting operation to be described below. The screen shifting operation program 153 cooperates with the OS 151 to perform processing for the screen shifting operation. As an example, the screen shifting operation program 153 is placed between an application layer and a layer of the OS 151. Therefore, there is no need to alter the document application 155 and the browsing application 157 for the screen shifting operation or to add special code thereto.
- FIGS. 4A-4C are diagrams for describing an outline of the screen shifting operation performed on the smartphone 100 with one-handed operation using the right hand, as shown in FIG. 1. The description will be made by taking the document application 155 as an example. In FIG. 4A, a home screen 181 composed of multiple icons, including an icon 155 a for the document application 155, is displayed on the touch screen 101. The home screen 181, which is displayed initially and from which applications are started, is also called a standby screen. The home screen is called a desktop screen in the case of a laptop PC.
- In addition to a client area 184 for displaying the home screen 181, a system area 182 for indicating system information, such as the radio wave state, the time, and the charging state, also appears on the touch screen 101. A user can display and operate an application screen in the client area 184, but cannot access the system area 182. A screen made up of the client area 184 and the system area 182 and displayed on the touch screen 101 is called the entire screen. The smartphone 100 is configured to display one application screen in the client area 184 on the touch screen 101. When the size (pixel count) of one screen is larger than the resolution of the display 103, a part is hidden from the client area 184, but the hidden part can be displayed by a scrolling action. The scrolling action can be achieved with a swipe or a flick on the touch screen 101.
- In this example, since the icon 155 a is displayed in the comfortable operation area 205, one-handed operation is enabled with a tap of the thumb. However, even when the icon 155 a is displayed in the inoperable area 209, one-handed operation is enabled by a method to be described later. When the icon 155 a is tapped, the document application 155 is started, and an application screen 155 b being edited is displayed in the client area 184 in a full-screen format, as shown in FIG. 4B. The display position of the entire screen in FIG. 4B is called a standard position.
- In the present specification, display in the full-screen format means that one application screen is displayed over the entire client area 184, which is distinguished from the case where multiple application screens are displayed in a window format. The size of an image to be displayed in the full-screen format can be larger than the resolution of the touch screen 101. In this case, an area hidden from the screen can be displayed by a scrolling action. Unlike a window screen, the display position of a screen displayed in the full-screen format cannot be changed on the home screen 181.
- The application screen 155 b includes a software keyboard 251. The software keyboard 251 may be part of the document application 155 or be provided by the OS 151. When it is provided by the OS 151, the OS 151, having detected the start of an application requiring keyboard input, displays the software keyboard 251 in a position indicated by the application in the client area 184, superimposed on the application screen 155 b.
- A cursor 155 c indicative of an input position is displayed at the end of a sentence. The software keyboard 251 also includes a target object 253 corresponding to a phone key displayed in the inoperable area 209. The target object 253 denotes an icon, a character, or a symbol to which the system responds with a tap or a flick on the touch screen 101 within the home screen 181 or the application screen 155 b. A reference position 255 of the touch screen 101 is defined at a corner of the touch screen 101 close to the hand holding the smartphone 100.
- Here, suppose that the user wants to perform input to the target object 253 with one-handed operation. The user assumes a virtual shifting straight line 257 between the reference position 255 and the target object 253, and performs a pressing operation at a pressed position 259 on the virtual shifting straight line 257 in the comfortable operation area 205. Here, touch panel operations and the pressing operation will be described. The touch panel operations are operations for causing the touch panel 105 to detect the coordinates of a touch or approach of a finger to the touch screen 101 with a pressure that does not exceed the lower limit value of the pressure sensor 107.
- The touch panel operations include multiple gestures, such as a touch of a finger on the touch screen 101, a tap to release the finger soon after the touch, a swipe to move the finger while touching, and a flick to move the touching finger quickly. Tap gestures also include a long tap, a gesture that takes a long time until the finger is released. The system is also adapted to multi-touch operation, enabling input by a gesture using two or more fingers at the same time. The pressing operation is an operation for causing the pressure sensor 107 to detect a pressure at or above the lower threshold value. In the pressing operation, a gesture for detecting the coordinates of a finger that touches the touch screen 101 also takes place, but the system can detect the pressing force to distinguish the gesture from the gestures for the touch panel operations.
- When the pressing operation is performed, the entire screen is shifted toward the pressed position 259 along the virtual shifting straight line 257 while maintaining screen consistency, without changing the screen size, the shape of the content, or the arrangement of either the application screen 155 b or the software keyboard 251. As a result, as shown in FIG. 4C, a blank screen 261 is displayed on the left edge and the upper edge of the touch screen 101, and at the same time, the application screen 155 b displayed in FIG. 4B runs off the lower right edges.
- The blank screen 261 is a screen displayed in an area of the touch screen 101 for which no image data required by an application processing section 309 exists, and the display 103 displays the area in color according to its normally white or normally black characteristic. In this regard, however, an image data generating section 301 (FIG. 5) may send special image data to the area displaying the blank screen 261 to display any background image. The entire screen continues to be shifted during the pressing operation, and the target object 253 eventually reaches the pressed position 259.
- When the user who visually confirms that the target object 253 has reached the pressed position 259 releases the finger quickly to release the pressure on the pressure sensor 107, the system, having detected the release, recognizes that there is input at the coordinates. At this time, the system acquires the input coordinates from the touch panel 105. After the input is confirmed, the system returns the screen to the state in FIG. 4B and waits for the next input. The input operation after shifting the entire screen to bring the target object 253 into the comfortable operation area 205 in this way is called a screen shifting operation.
- The screen shifting operation is a manipulation technique for easily achieving one-handed operation. The screen shifting operation is achieved by cooperation between a touch panel operation and the pressing operation. At this time, the pressing operation serves to allow the system to recognize that the coordinates detected by the touch panel 105 are accompanied by the screen shifting operation. When the screen shifting operation is performed, not only does part of the application screen displayed in the standard position run off, but a blank screen also appears. Alternatively, the entire screen may be defined as the application screen displayed in the client area 184. In that case, only the application screen 155 b displayed in the client area 184 is shifted with the screen shifting operation, without shifting the screen of the system area 182.
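The division of labor between the touch panel operations and the pressing operation amounts to routing the same coordinates by pressure; a sketch with illustrative names:

```python
def route_event(coords, pressure, lower_limit):
    """Coordinates with no significant pressure are an ordinary touch
    panel operation delivered to the application; pressure at or above
    the lower limit of the pressure sensor 107 marks the same
    coordinates as part of a screen shifting operation."""
    if pressure >= lower_limit:
        return ("screen_shifting", coords)
    return ("touch_panel", coords)
```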
- FIG. 5 is a functional block diagram showing the configuration of an input system 300 that supports the screen shifting operation. The input system 300 is configured of the hardware resources shown in FIG. 2 and the software resources shown in FIG. 3. The image data generating section 301, a coordinate conversion section 303, a shifting direction determining section 311, and an input processing section 307 can be implemented mainly by cooperation between the OS 151 and the screen shifting operation program 153, and hardware resources, such as the CPU 109 and the main memory 113, executing these software resources.
- The application processing section 309 is implemented mainly by cooperation between software resources, such as the document application 155, the browsing application 157, the screen shifting application 159, and the OS 151, and hardware resources for executing these software resources. An application developer can create code without considering the screen shifting operation at all.
- The input processing section 307 receives coordinate data and pressure data from the touch panel controller 115, and receives acceleration data from the acceleration sensor 119. When receiving no pressure data, the input processing section 307 determines that a touch panel operation is performed, and sends the application processing section 309 the coordinate data and the acceleration data received. When receiving pressure data, the input processing section 307 determines that the pressing operation is performed, and sends the shifting direction determining section 311 the pressure data, the coordinate data, and the acceleration data received. When detecting input to the target object 253 performed during the screen shifting operation, the input processing section 307 sends the coordinates of the pressed position 259 to the coordinate conversion section 303.
- The shifting direction determining section 311 registers data for defining the outer boundary 201 and the inner boundary 203 to identify the comfortable operation area 205, and coordinate data on the reference position 255. The shifting direction determining section 311 calculates the formulas of the virtual shifting straight line 257 and a shifting straight line 258. The shifting direction determining section 311 also registers whether the hand holding the smartphone 100 when a special operation is performed is the right hand or the left hand. The coordinate conversion section 303 calculates a shifting vector from the formula of the shifting straight line 258 and the pressure data, calculates the coordinates of a reference position 186 of the entire screen when displayed on the touch screen 101, and sends the calculation results to the image data generating section 301. The coordinate conversion section 303 converts coordinate data on the pressed position 259 received from the input processing section 307 into coordinate data on the standard position, and sends it to the application processing section 309.
- The application processing section 309 receives coordinate data on the input position from the input processing section 307 or the coordinate conversion section 303, and executes the document application 155 or the browsing application 157. The application processing section 309 does not recognize that the display position of the application screen 155 b is changed by the coordinate conversion section 303. Based on an instruction from the application processing section 309 or the coordinate conversion section 303, the image data generating section 301 generates pixel data to be displayed on the display 103, and outputs the pixel data to an I/O controller 123.
FIG. 6 is a flowchart of a method for the input system 300 to perform the screen shifting operation. In block 401, the screen shifting application 159 is started with a touch panel operation to register, with the shifting direction determining section 311, the outer boundary 201, or the outer boundary 201 and the inner boundary 203, in FIG. 1. The screen shifting application 159 displays a wizard screen on the display 103 that prompts the user to tap several positions on the touch screen 101 in order, one-handed, with the thumb of the right hand and of the left hand. - Coordinate data on the tap positions are sent from the
application processing section 309 to the shifting direction determining section 311. As an example, the shifting direction determining section 311 creates, from the coordinates received, data indicating the center of the annular comfortable operation area 205 approximated by a circular arc. Further, the shifting direction determining section 311 defines the outer boundary 201, or the outer boundary 201 and the inner boundary 203, as circular arcs concentric with the center of that arc, obtained by increasing or decreasing the radius at a predetermined ratio, and registers the coordinate data. - Data on the
outer boundary 201 and the inner boundary 203 may also be generated directly from the coordinates touched by the ball of the thumb when swiping with the thumb. Further, the shifting direction determining section 311 registers the coordinates of the reference position 255 (FIG. 4) on the touch screen 101. As an example, the coordinates of the reference position 255 can be the coordinates of the lower right corner of the touch screen 101 for one-handed operation with the right hand, or the coordinates of the lower left corner for one-handed operation with the left hand. - Alternatively, the coordinates of the
reference position 255 can be the central coordinates of the circular arc that approximates the outer boundary 201 and the inner boundary 203. The central coordinates of the circular arc correspond to a position close to the base of the thumb, and may be located outside of the touch screen 101. When the registration of the coordinates of the outer boundary 201, the inner boundary 203, and the reference position 255 for each of the right hand and the left hand is completed and the screen shifting application 159 is shut down, the preparation for the screen shifting operation is complete. - In
block 403, the user performs a special operation to inform the system whether the hand holding the smartphone 100 at present is the right hand or the left hand. The special operation is not particularly limited as long as it can be distinguished from a touch panel operation on an object, but it is desirable that it can be performed while continuing the one-handed operation, without switching the smartphone 100 to the other hand. As an example, the special operation can be a gesture of swiping through the comfortable operation area 205 with the right thumb or left thumb while pressing with the thumb. - In another example, the special operation can be an operation of characteristically shaking the smartphone once or a few times while touching the comfortable operation area with the right thumb or left thumb to cause the
acceleration sensor 119 to generate an acceleration signal. The input processing section 307 sends the shifting direction determining section 311 the pressure data, coordinate data, and acceleration data produced when the special operation is performed. From the pressure data, the coordinate data, or the acceleration data received, the shifting direction determining section 311 recognizes and registers whether the hand holding the smartphone at present is the right hand or the left hand. - In
block 405, the icon 155 a is tapped on the home screen 181 in FIG. 4A to start the document application 155. A target object displayed in the comfortable operation area 205 can be operated with a touch panel operation. When the inner boundary 203 is not defined, a target object displayed in the difficult operation area 207 can also be operated with a touch panel operation. If the icon 155 a is displayed in the inoperable area 209, input to the icon 155 a can also be performed by the screen shifting operation. When the tap operation is performed on the icon 155 a displayed in the comfortable operation area 205, the application screen 155 b is displayed on the display 103 in the full-screen format as shown in FIG. 4B. - As an example, the
OS 151 defines coordinates (0, 0) at the upper left corner of the touch screen 101. The OS 151 also defines the reference position 186 of the entire screen displayed on the touch screen 101. Here, the reference position 186 of the entire screen is defined at the upper left corner of the system area 182. When the application 155 requests the OS 151 to display the application screen 155 b, the OS 151 displays the application screen 155 b in the client area 184 in the full-screen format. The display position of the entire screen on the touch screen 101 at this time is called the standard position. - In
block 407, the screen shifting operation is started for the target object 253 on the application screen 155 b displayed in the inoperable area 209. When the cursor 155 c is accessed to change the character input position, the cursor 155 c becomes the target object. Further, when the target object 253 is outside the display range of the touch screen 101, the screen shifting operation can be performed after the target object 253 is scrolled into view in the inoperable area 209. When no inner boundary 203 is registered, the user visualizes the virtual shifting straight line 257 that connects the reference position 255 set at the corner of the touch screen 101 and the target object 253, and presses a position on it with the thumb. The pressed position 259 naturally falls within the comfortable operation area 205. - When the
reference position 255 is defined at the center of the circular arc on the touch screen 101, the tip of the thumb only has to be pointed naturally at the target object 253. When the inner boundary 203 is also defined for the difficult operation area 207, a position close to the outer boundary 201 in the comfortable operation area 205 is pressed. When a position close to the inner boundary 203 in the comfortable operation area 205 is pressed, the entire screen is shifted so that the difficult operation area 207 approaches the pressed position 259. When the screen shifting operation is performed to start shifting the entire screen by the following procedure, the input processing section 307 sends the shifting direction determining section 311 the coordinate data and the pressure data until input is confirmed. - In
block 409, the shifting direction determining section 311 calculates the formula of the virtual shifting straight line 257 that connects the coordinates of the reference position 255 on the touch screen 101 and the coordinates of the pressed position 259. Further, the shifting direction determining section 311 creates the formula of the shifting straight line 258, which passes through the coordinates (0, 0) of the touch screen 101 matching the reference position 186 of the entire screen and is parallel to the virtual shifting straight line 257, and sends the formula to the coordinate conversion section 303. In block 411, the coordinate conversion section 303 calculates a shifting vector from the formula of the shifting straight line 258 and the pressure data. The shifting vector is coordinate data on the reference position 186 of the entire screen to be shifted. - In an example of calculating the shifting vector, a lower limit value and an upper limit value are set for the pressure data received from the
pressure sensor 107 to calculate a position vector, with the coordinates of the reference position 186 of the entire screen assigned between the lower limit value and the upper limit value. Specifically, the reference position 186 immediately before the pressure data exceeds the lower limit value is set to the coordinates (0, 0) of the touch screen 101, and the coordinates of the reference position 186 when the pressure data reaches the upper limit value are set to the intersection between the shifting straight line 258 and the right end of the touch screen 101. In this case, the coordinates of the reference position 186 during the screen shifting operation lie on the shifting straight line 258 at a point proportional to the pressing force. Thus, the target object 253 is bound to pass through the pressed position 259 before the pressure data reaches the upper limit value. - In another example of calculating the shifting vector, a velocity vector corresponding to a change in the pressing force is calculated. Specifically, the shifting velocity is made to correspond to the time differential value of the pressing force. In this case, the coordinates of the
reference position 186 of the entire screen during the screen shifting operation are changed in such a manner that the application screen 155 b is shifted in the lower right direction when the pressure is increased, the shifting is stopped while the pressing force does not change, and the application screen 155 b is shifted in the returning direction when the pressure is decreased. The shifting velocity can then be made proportional to the time differential value of the pressing force. - In
block 413, the coordinate conversion section 303 sends the image data generating section 301 the coordinates of the reference position 186 of the entire screen continuously at predetermined intervals. Each time it receives the coordinates of the reference position 186, the image data generating section 301 updates the image data to place the reference position 186 at the designated coordinates, and displays the entire screen in the shifted position while maintaining screen consistency. The application screen 155 b is displayed in a position shifted toward the lower right of the touch screen 101 along with the shifting of the entire screen. - As a result, the
touch screen 101 displays the blank screen 261 at the left edge and the upper edge, and the target object 253 approaches the pressed position 259 while the display of the application screen 155 b runs off the lower right edges. In block 415, the user may change the pressed position 259 to correct the shifting direction when the first pressed position 259 is not appropriate. When the pressed position is changed while the finger remains pressed, the procedure returns to block 409. The input processing section 307 sends the shifting direction determining section 311 the coordinate data on the finger and the pressure data after the pressed position is changed. In block 409, the shifting direction determining section 311 recalculates the virtual shifting straight line 257 and the shifting straight line 258. - In
block 417, the user, who visually determines that the target object 253 has reached the pressed position 259, releases the finger quickly from the touch screen 101. The finger is released quickly because, when the finger is released slowly after the screen shifting operation starts, the position vector or the velocity vector is recalculated without confirming the input, so that the entire screen can be returned to the standard position. It is therefore not necessary to limit the operation for confirming input to the quick release of the finger. When detecting a sudden change in pressure, the input processing section 307 determines that input has been performed by the screen shifting operation, and sends the coordinates of the pressed position 259 to the coordinate conversion section 303. - In
block 421, the coordinate conversion section 303 calculates the shifting amount and shifting direction of the reference position 186 of the entire screen from the start of the screen shifting operation until the input is confirmed. The coordinate conversion section 303 converts the coordinates of the pressed position 259 shown in FIG. 4C into the coordinates of the target object 253 on the application screen 155 b displayed in the standard position of FIG. 4B, and sends the coordinates of the target object 253 to the application processing section 309. The application processing section 309 assumes that the application screen 155 b is always displayed in the standard position, and performs processing in response to the input operation performed on the target object 253. After sending the coordinates of the target object 253, the coordinate conversion section 303 requests the image data generating section 301 in block 423 to display the entire screen with the reference position 186 at the coordinates (0, 0) of the touch screen 101, and waits for the next input. - In the above procedure, when the
inner boundary 203 is also defined in block 401, if a position in the neighborhood of the inner boundary 203 is pressed to perform input to a target object displayed in the difficult operation area 207, the shifting direction determining section 311 calculates the shifting direction so as to shift the application screen 155 b in the upper left direction. When the input operation to one target object is confirmed, the display position of the entire screen is returned to the standard position in block 423. This method is convenient when the next target object is displayed in the comfortable operation area at the standard position. - Here, suppose that target objects 253 and 254 are displayed together in the
comfortable operation area 205 with one screen shifting operation. In that case, when the target object 254 is accessed following the target object 253, it is convenient if input can be performed continuously without returning the entire screen to the standard position. To that end, when the input to the target object 253 is confirmed, the coordinate conversion section 303 can stop returning the entire screen to the standard position so that a touch panel operation can be performed continuously on the shifted application screen 155 b. - As an example of the operation at this time, when receiving an event of a quick release of the finger after the entire screen is shifted to a predetermined position, the coordinate
conversion section 303 fixes the display position of the entire screen at the coordinates at that time. Then, when pressing is stopped and the target objects 253 and 254 are tapped in order, the input processing section 307 sends the coordinate data to the application processing section 309. Further, when pressing is restarted, the shifting direction determining section 311 calculates a new virtual shifting straight line 257. When the coordinate conversion section 303 shifts the entire screen again and receives an event of the quick release of the finger, the shifting of the entire screen is stopped again. When receiving an event of releasing the finger slowly during the pressing operation, the coordinate conversion section 303 returns the display position of the entire screen to the standard position. - Note that the screen shifting operation can also be applied to an application screen displayed in a window format.
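The geometry of blocks 409 through 421, which derives the shifting straight line, moves the reference position 186 in proportion to the pressing force, and converts the confirmed press back to standard-position coordinates, can be condensed into a short sketch. This is an illustrative reading for the right-hand case (reference position 255 at the lower right corner, y axis pointing down from the origin at the upper left); the pressure limits and function names are assumptions, not values from the disclosure:

```python
def shift_vector(reference_pos, pressed_pos, pressure, screen_width,
                 p_low=0.5, p_high=3.0):
    """Blocks 409-411 (sketch): the shifting straight line runs through the
    screen origin (0, 0), parallel to the virtual line joining the pressed
    position and the reference position.  The reference position 186 of the
    entire screen moves along that line in proportion to the pressing force,
    reaching the right screen edge when the pressure hits the (assumed)
    upper limit p_high."""
    # Direction from the pressed position toward the lower-right reference
    dx = reference_pos[0] - pressed_pos[0]
    dy = reference_pos[1] - pressed_pos[1]
    if dx <= 0:
        raise ValueError("expected the reference position to the right")
    # Fraction of the full shift, clamped between the pressure limits
    frac = max(0.0, min(1.0, (pressure - p_low) / (p_high - p_low)))
    # The full shift ends where the shifting line meets the edge x = screen_width
    return (frac * screen_width, frac * screen_width * dy / dx)

def to_standard(pressed_pos, shift):
    """Block 421 (sketch): map the pressed position on the shifted screen
    back to coordinates on the application screen at the standard position."""
    return (pressed_pos[0] - shift[0], pressed_pos[1] - shift[1])
```

With the reference position at (300, 400), a press at (150, 200), and a mid-range pressure, the screen shifts halfway along the line toward the lower right; a confirmed press is then converted back simply by subtracting that shift.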
FIG. 7 shows a state where the screen shifting operation is performed while an application screen 157 b of the browsing application 157 is displayed in a window format. Among application screens displayed in the window format, the application screen displayed in the foreground becomes the target of the screen shifting operation; the screens and the home screen 181 displayed in the background are not shifted. The application screen 157 b is shifted by the screen shifting operation within the range of the client area 184, and its display runs off the edges. - On the
application screen 157 b, all characters, images, and icons with hyperlinks embedded in them can be set as target objects for input with the screen shifting operation. In this case, when input is performed to the next target object, the application screen 157 b may first be returned to its position before the start of the screen shifting operation, or the next target object may be tapped after the shifting is stopped so that input continues without interruption. - Since the
input system 300 can perform the screen shifting operation with a pressing operation and a touch panel operation on the surface of the display 103, one-handed operation can be performed easily while the device is held stably. Note that the present invention can be realized without using the pressure sensor. For example, when the touch screen 101 is pressed with a predetermined force of a finger, the contact area of the finger can be calculated from the coordinates detected by the touch panel 105 to generate an event for performing the screen shifting operation. - Alternatively, a special gesture can be defined for a touch panel operation to enable the screen shifting operation after the
input system 300 enters a screen shifting operation mode. For example, the shifting direction determining section 311 and the coordinate conversion section 303 are configured so that, while the input system 300 is in the screen shifting operation mode, an application screen is shifted with a swipe of a finger, and when the finger is released, the display position of the screen is fixed at that position. Although the swipe creates a blank screen and causes the application screen to run off the edges, the user can perform input with a touch panel operation after shifting the entire screen or a window screen to a convenient position. - As has been described, the present disclosure provides a method for improving one-handed operability of a portable information terminal having a touch screen.
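The calibration of block 401, which approximates the comfortable operation area by a circular arc from the wizard's tap positions, can be illustrated with the circumcircle of three taps. The boundary ratio, the use of exactly three taps, and the function name are assumptions made for the sketch, not details of the disclosure:

```python
import math

def calibrate_boundaries(p1, p2, p3, ratio=0.2):
    """Sketch of block 401: take three tap positions from the wizard,
    compute the center and radius of the circle through them (the arc
    approximating the comfortable operation area), then derive the outer
    and inner boundaries as concentric arcs whose radii are increased and
    decreased by an assumed ratio.  The center corresponds to a position
    near the base of the thumb and may lie outside the touch screen."""
    (ax, ay), (bx, by), (cx, cy) = p1, p2, p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if d == 0:
        raise ValueError("tap positions are collinear")
    a2, b2, c2 = ax * ax + ay * ay, bx * bx + by * by, cx * cx + cy * cy
    ux = (a2 * (by - cy) + b2 * (cy - ay) + c2 * (ay - by)) / d  # center x
    uy = (a2 * (cx - bx) + b2 * (ax - cx) + c2 * (bx - ax)) / d  # center y
    r = math.hypot(ax - ux, ay - uy)
    return (ux, uy), r * (1.0 + ratio), r * (1.0 - ratio)
```

For taps at (8, 5), (5, 8), and (2, 5), the fitted center is (5, 5) with radius 3, giving concentric outer and inner boundaries at radii 3.6 and 2.4.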
- While the disclosure has been particularly shown and described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the disclosure.
Claims (13)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-130513 | 2013-06-21 | ||
JP2013130513A JP5759660B2 (en) | 2013-06-21 | 2013-06-21 | Portable information terminal having touch screen and input method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140380209A1 true US20140380209A1 (en) | 2014-12-25 |
Family
ID=52112051
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/294,729 Abandoned US20140380209A1 (en) | 2013-06-21 | 2014-06-03 | Method for operating portable devices having a touch screen |
Country Status (2)
Country | Link |
---|---|
US (1) | US20140380209A1 (en) |
JP (1) | JP5759660B2 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101577277B1 (en) | 2015-02-04 | 2015-12-28 | 주식회사 하이딥 | Touch type distinguishing method and touch input device performing the same |
JP6380341B2 (en) * | 2015-11-12 | 2018-08-29 | 京セラドキュメントソリューションズ株式会社 | Operation input device and operation input method |
JP2017157079A (en) * | 2016-03-03 | 2017-09-07 | 富士通株式会社 | Information processor, display control method, and display control program |
KR102044824B1 (en) * | 2017-06-20 | 2019-11-15 | 주식회사 하이딥 | Apparatus capable of sensing touch and touch pressure and control method thereof |
CN112698756A (en) * | 2019-10-23 | 2021-04-23 | 华为终端有限公司 | Display method of user interface and electronic equipment |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040196267A1 (en) * | 2003-04-02 | 2004-10-07 | Fujitsu Limited | Information processing apparatus operating in touch panel mode and pointing device mode |
US20090070670A1 (en) * | 2007-09-06 | 2009-03-12 | Sharp Kabushiki Kaisha | Information display device |
US20090160805A1 (en) * | 2007-12-21 | 2009-06-25 | Kabushiki Kaisha Toshiba | Information processing apparatus and display control method |
US20130030780A1 (en) * | 2008-07-10 | 2013-01-31 | Christopher Hazard | Methods, Systems, and Computer Program Products for Simulating a Scenario by Updating Events Over a Time Window Including the Past, Present, and Future |
US20130234949A1 (en) * | 2012-03-06 | 2013-09-12 | Todd E. Chornenky | On-Screen Diagonal Keyboard |
US20130241842A1 (en) * | 2012-03-19 | 2013-09-19 | Tak-Man Ma | Method, device, and computer-readable medium for changing size of touch permissible region of touch screen |
US20140109022A1 (en) * | 2012-09-17 | 2014-04-17 | Huawei Device Co., Ltd. | Touch Operation Processing Method and Terminal Device |
US20140137036A1 (en) * | 2012-11-15 | 2014-05-15 | Weishan Han | Operation Window for Portable Devices with Touchscreen Displays |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5732784B2 (en) * | 2010-09-07 | 2015-06-10 | ソニー株式会社 | Information processing apparatus, information processing method, and computer program |
JP5999374B2 (en) * | 2011-09-05 | 2016-09-28 | 日本電気株式会社 | Portable terminal device, portable terminal control method, and program |
JP5779064B2 (en) * | 2011-09-28 | 2015-09-16 | 京セラ株式会社 | Apparatus, method, and program |
JP5993802B2 (en) * | 2013-05-29 | 2016-09-14 | 京セラ株式会社 | Portable device, control program, and control method in portable device |
- 2013-06-21: JP JP2013130513A patent/JP5759660B2/en (Active)
- 2014-06-03: US US14/294,729 patent/US20140380209A1/en (Abandoned)
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150141085A1 (en) * | 2012-06-14 | 2015-05-21 | Zone V Ltd. | Mobile computing device for blind or low-vision users |
US20140033130A1 (en) * | 2012-07-25 | 2014-01-30 | Isis Srl | Method for controlling and activating a user interface and device and installation using such a method and interface |
USD758441S1 (en) * | 2013-11-22 | 2016-06-07 | Lg Electronics Inc. | Multimedia terminal with graphical user interface |
US20150346994A1 (en) * | 2014-05-30 | 2015-12-03 | Visa International Service Association | Method for providing a graphical user interface for an electronic transaction with a handheld touch screen device |
US10481789B2 (en) | 2014-05-30 | 2019-11-19 | Visa International Service Association | Method for providing a graphical user interface for an electronic transaction with a handheld touch screen device |
US9990126B2 (en) * | 2014-05-30 | 2018-06-05 | Visa International Service Association | Method for providing a graphical user interface for an electronic transaction with a handheld touch screen device |
US10296774B2 (en) * | 2014-09-18 | 2019-05-21 | Huawei Technologies Co., Ltd. | Fingerprint recognition apparatus |
US20170364196A1 (en) * | 2014-10-23 | 2017-12-21 | Zte Corporation | Touch Screen Device and Method for Operating Touch Screen Device |
CN106687903A (en) * | 2014-12-29 | 2017-05-17 | 华为技术有限公司 | Method for reducing valid presentation region of screen and mobile terminal |
US10318131B2 (en) * | 2014-12-29 | 2019-06-11 | Huawei Technologies Co., Ltd. | Method for scaling down effective display area of screen, and mobile terminal |
EP3232311A4 (en) * | 2014-12-29 | 2018-01-17 | Huawei Technologies Co., Ltd. | Method for reducing valid presentation region of screen and mobile terminal |
CN115048007A (en) * | 2014-12-31 | 2022-09-13 | 创新先进技术有限公司 | Device and method for adjusting distribution range of interface operation icons and touch screen equipment |
US10671243B2 (en) | 2015-07-27 | 2020-06-02 | Samsung Electronics Co., Ltd. | Screen operating method and electronic device supporting the same |
WO2017018722A1 (en) * | 2015-07-27 | 2017-02-02 | Samsung Electronics Co., Ltd. | Screen operating method and electronic device supporting the same |
US10372320B2 (en) * | 2015-08-17 | 2019-08-06 | Hisense Mobile Communications Technology Co., Ltd. | Device and method for operating on touch screen, and storage medium |
CN105511790A (en) * | 2015-12-09 | 2016-04-20 | 上海斐讯数据通信技术有限公司 | Touch screen control method and system of electronic equipment provided with touch screen and electronic equipment |
US20190018555A1 (en) * | 2015-12-31 | 2019-01-17 | Huawei Technologies Co., Ltd. | Method for displaying menu on user interface and handheld terminal |
TWI638308B (en) * | 2016-11-16 | 2018-10-11 | 騰訊科技(深圳)有限公司 | Touch screen-based control method and device |
US10866730B2 (en) | 2016-11-16 | 2020-12-15 | Tencent Technology (Shenzhen) Company Limited | Touch screen-based control method and apparatus |
CN106951135A (en) * | 2017-02-06 | 2017-07-14 | 努比亚技术有限公司 | A kind of method and terminal for realizing the adjustment of input tool column |
US10254940B2 (en) | 2017-04-19 | 2019-04-09 | International Business Machines Corporation | Modifying device content to facilitate user interaction |
US10089122B1 (en) * | 2017-07-21 | 2018-10-02 | International Business Machines Corporation | Customizing mobile device operation based on touch points |
US11487425B2 (en) * | 2019-01-17 | 2022-11-01 | International Business Machines Corporation | Single-hand wide-screen smart device management |
CN111831108A (en) * | 2019-04-19 | 2020-10-27 | 宏达国际电子股份有限公司 | Mobile device and control method thereof |
EP4083751A4 (en) * | 2019-12-25 | 2022-12-21 | Sony Group Corporation | Information processing device, program, and method |
WO2022248056A1 (en) | 2021-05-27 | 2022-12-01 | Telefonaktiebolaget Lm Ericsson (Publ) | One-handed operation of a device user interface |
US12204706B2 (en) | 2021-05-27 | 2025-01-21 | Telefonaktiebolaget Lm Ericsson (Publ) | Backside user interface for handheld device |
Also Published As
Publication number | Publication date |
---|---|
JP5759660B2 (en) | 2015-08-05 |
JP2015005173A (en) | 2015-01-08 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: LENOVO (SINGAPORE) PTE. LTD., SINGAPORE; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: TSUKAMOTO, YASUSHI; REEL/FRAME: 033019/0665; Effective date: 20140407
| STCV | Information on status: appeal procedure | NOTICE OF APPEAL FILED
| STCV | Information on status: appeal procedure | APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER
| STCV | Information on status: appeal procedure | EXAMINER'S ANSWER TO APPEAL BRIEF MAILED
| STCV | Information on status: appeal procedure | ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS
| STCV | Information on status: appeal procedure | BOARD OF APPEALS DECISION RENDERED
| STCB | Information on status: application discontinuation | ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION