US20200218342A1 - Personalized adaptation of virtual reality content based on eye strain context - Google Patents
Personalized adaptation of virtual reality content based on eye strain context
- Publication number: US20200218342A1
- Application number: US 16/239,377
- Authority: US (United States)
- Prior art keywords: adaptation, eye, user, content, determining
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61B 3/0025: Apparatus for testing the eyes; operational features characterised by electronic signal processing, e.g. eye models
- A61B 3/10: Apparatus for testing the eyes; objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B 3/113: Apparatus for testing the eyes; objective types for determining or recording eye movement
- G02B 27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G02B 27/017: Head-up displays; head mounted
- G06F 3/011: Interaction arrangements between user and computer; arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F 3/012: Head tracking input arrangements
- G06F 3/013: Eye tracking input arrangements
- G06T 19/003: Manipulating 3D models or images for computer graphics; navigation within 3D models or images
Description
- Embodiments of the invention relate to personalized adaptation of Virtual Reality (VR) content based on eye strain context.
- A VR headset may be described as a device that is mounted on a person's head and covers the person's eyes. VR headsets may be used for games, simulators, trainers, etc. A VR headset typically comprises a stereoscopic head-mounted display (providing separate images for each eye), stereo sound, and head motion tracking sensors (to allow for shifting the picture as the head moves). The VR headset creates a life-size, three-dimensional (3D) virtual environment and enables perception of depth and an increased field of view (the width of the picture) to create a sense of immersion.
- Each year, a new toy emerges on the market challenging the newest technology and expectations. Even though VR technology is still developing, there are concerns over its long-term effects on users' eyesight. The simplest way to use VR technology is through head-mounted displays (HMDs). With a host of new games, apps, and headsets offering an engaging 3D experience for users, it is not surprising that many families are embracing VR technology.
- However, continuous usage of VR headsets (i.e., long viewing of VR content) may lead to eye stress, dizziness, and motion sickness. Other problems may occur with eye dryness/focus/movement, headaches, convergence-accommodation conflicts, and coordination imbalance. Some VR headsets come with disclaimers stating that children under the age of 13 should not use the headsets because prolonged use may negatively impact hand-eye coordination, balance, and multi-tasking ability.
- Some conventional solutions address eye tracking for medical purposes, color settings, ambient light sensing, adapting data, power efficiency, reducing backlight, color tuning, reduced frame rates during scrolling, finger shadows, network efficiency, and feedback-based adaptation.
- In accordance with embodiments, a computer-implemented method is provided for personalized adaptation of VR content based on eye strain context. An initial eye strain context is determined for a user while wearing a Virtual Reality (VR) headset to view VR content in a User Interface (UI). A UI adaptation and an intensity of the UI adaptation are identified, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. Modified VR content is rendered in the UI by applying the UI adaptation based on the intensity of the UI adaptation. An updated eye strain context is determined. In response to determining that the updated eye strain context indicates that eye strain has decreased, a priority weight for the UI adaptation is increased, and the UI adaptation, the intensity of the UI adaptation, and the priority weight are saved in a user profile for the user.
- In accordance with other embodiments, a computer program product is provided for personalized adaptation of VR content based on eye strain context. The computer program product comprises a computer readable storage medium having program code embodied therewith, the program code executable by at least one processor to perform the same operations: determining an initial eye strain context, identifying a UI adaptation and its intensity, rendering modified VR content, determining an updated eye strain context, and, in response to decreased eye strain, increasing a priority weight for the UI adaptation and saving the UI adaptation, the intensity, and the priority weight in the user profile.
- In yet other embodiments, a computer system is provided for personalized adaptation of VR content based on eye strain context. The computer system comprises one or more processors, one or more computer-readable memories, and one or more computer-readable, tangible storage devices; and program instructions, stored on at least one of the storage devices for execution by at least one of the processors via at least one of the memories, to perform the same operations as the computer-implemented method described above.
- Referring now to the drawings, in which like reference numbers represent corresponding parts throughout:
- FIG. 1 illustrates, in a block diagram, a computing environment of a VR headset in accordance with certain embodiments.
- FIG. 2 illustrates object velocity back and forth adaptation in accordance with certain embodiments.
- FIG. 3 illustrates rotation movement calibration adaptation in accordance with certain embodiments.
- FIG. 4 illustrates UI position adaptation in accordance with certain embodiments.
- FIGS. 5A and 5B illustrate, in a flowchart, operations for adapting a UI of a VR headset in accordance with certain embodiments.
- FIG. 6 illustrates, in a flowchart, operations for using a user profile in accordance with certain embodiments.
- FIG. 7 illustrates a computing node in accordance with certain embodiments.
- Embodiments enable VR content to auto-adapt to avoid eye fatigue in users (wearers of the VR headset). Embodiments re-represent the VR content in a different User Interface (UI) form to suit the energy needs of the VR headset. Embodiments adapt the UI by considering user feedback, which helps cater to the aesthetics of the UI while also trying to achieve energy/network efficiency, and incorporate direct and indirect feedback to balance user comfort and energy efficiency.
- FIG. 1 illustrates, in a block diagram, a computing environment of a VR headset 50 in accordance with certain embodiments.
- The VR headset 50 includes a computing device 100. The computing device 100 includes an eye activity detector 120, an application usage time monitor 130, an interaction monitor 140, a state predictor 160, a UI adaptor 180, and a UI display 190. The UI display 190 displays one or more objects for viewing by the user of the VR headset.
- The VR headset 50 may include sensors or may receive sensor data from sensors apart from the VR headset 50. The sensor data forms a sensor stream 110 that is input to the eye activity detector 120. The eye activity detector 120 processes the sensor stream 110 to output an eye blink rate, iris enlargement and contraction event data, and an eye fatigue index 150, as in the hypothetical sketch below.
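- The patent does not specify how the eye fatigue index 150 is computed. The following Python sketch is illustrative only: the `EyeSample` fields, thresholds, and weights are assumptions chosen to reflect the signals the detector is said to track (blink rate, iris enlargement/contraction, eye trajectory movement, ambient lighting).

```python
from dataclasses import dataclass

@dataclass
class EyeSample:
    """One processed frame from the headset's eye-facing sensors (hypothetical)."""
    blink_rate: float      # blinks per minute
    iris_change: float     # magnitude of iris enlargement/contraction, 0..1
    gaze_velocity: float   # eye trajectory movement, in degrees/second
    ambient_light: float   # normalized ambient lighting level, 0..1

def eye_fatigue_index(samples: list[EyeSample]) -> float:
    """Combine recent samples into a 0..1 fatigue index.

    Illustrative rule: a low blink rate, large iris changes, fast gaze
    movement, and dim ambient light all push the index upward.
    """
    if not samples:
        return 0.0
    n = len(samples)
    avg_blink = sum(s.blink_rate for s in samples) / n
    avg_iris = sum(s.iris_change for s in samples) / n
    avg_gaze = sum(s.gaze_velocity for s in samples) / n
    avg_light = sum(s.ambient_light for s in samples) / n

    blink_penalty = max(0.0, (15.0 - avg_blink) / 15.0)  # healthy rate is roughly 15/min
    gaze_penalty = min(1.0, avg_gaze / 400.0)            # saccade-heavy viewing
    light_penalty = 1.0 - avg_light                      # dim environment

    index = (0.4 * blink_penalty + 0.3 * avg_iris
             + 0.2 * gaze_penalty + 0.1 * light_penalty)
    return max(0.0, min(1.0, index))
```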
- The application usage time monitor 130 receives as input the application on/off time 112 and outputs an expected application usage time 152.
- The interaction monitor 140 receives as input eye and head interaction events 114 (i.e., movements of the user's eyes and head) and outputs an interaction rate for each UI control 154. In certain embodiments, the UI controls 154 may be described as application widgets.
- The state predictor 160 receives as input the eye blink rate, iris enlargement and contraction event data, and eye fatigue index 150, the expected application usage time 152, and the interaction rate for each UI control 154. Using these inputs, the state predictor 160 outputs an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation 170 for one or more of the objects in the UI; a sketch of such a mapping follows.
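- The patent does not describe the state predictor's internal logic; this sketch shows one plausible mapping from its three inputs to adaptation decisions. The `Adaptation` type, the 0.3 strain threshold, and the 60-minute normalization are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Adaptation:
    kind: str         # "velocity", "rotation", or "position"
    intensity: float  # 0..1, scaled with the eye fatigue index

def predict_adaptations(fatigue_index: float,
                        expected_usage_minutes: float,
                        interaction_rates: dict[str, float]) -> list[Adaptation]:
    """Map fatigue index, expected usage time, and per-control interaction
    rates to candidate UI adaptations (all thresholds illustrative)."""
    if fatigue_index < 0.3:
        return []  # no meaningful eye strain: render the original VR content

    # Longer expected sessions justify stronger intervention.
    usage_boost = min(1.0, expected_usage_minutes / 60.0)
    intensity = min(1.0, fatigue_index * (0.7 + 0.3 * usage_boost))

    adaptations = [Adaptation("velocity", intensity),
                   Adaptation("rotation", intensity)]

    # If interaction concentrates on a few widgets, also spread widgets
    # apart to counter same-area focus.
    if interaction_rates and max(interaction_rates.values()) > 0.5:
        adaptations.append(Adaptation("position", intensity))
    return adaptations
```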
- The UI adaptor 180 receives the velocity, rotation movement calibration, and position adaptations at an object level 170 and modifies the UI displayed by the UI display 190.
- The UI adaptor 180 personalizes and adapts the user interface of VR content (e.g., VR games) by tracking continuous feedback from the VR user's eye strain context. The eye strain context may be any one or more of the following states (but is not limited to them): rapid eye focus change, more extreme corner eye movements, static long-time same eye focus, less eye blinking, and eye tears and tiredness.
- In different embodiments, different techniques may be used to detect the eye strain context. In certain embodiments, the UI adaptor 180 detects the eye strain context by deriving an eye fatigue index and then determining the eye strain context from it. In certain embodiments, the eye fatigue index is derived by the eye activity detector 120 using sensors in the VR headset to track eye blink rate, iris enlargement/iris focus change, eye trajectory movements, and ambient lighting. In certain embodiments, the VR headset includes an optical sensor and a camera, which track eye movements and other inferences to derive the eye fatigue index. Based on these inputs, and correlating them with vision do's and don'ts principles, the UI adaptor 180 determines an eye strain context of the user (one possible classification is sketched below). Since the VR headset is very close to the eyes, these inputs may be derived accurately.
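- As a toy illustration of how tracked features might be turned into the strain states listed above, consider the classifier below; the feature names and thresholds are hypothetical, not taken from the patent.

```python
def eye_strain_context(features: dict[str, float]) -> set[str]:
    """Return the set of detected eye strain states (thresholds illustrative)."""
    states = set()
    if features.get("focus_changes_per_second", 0.0) > 3.0:
        states.add("rapid eye focus change")
    if features.get("corner_dwell_ratio", 0.0) > 0.4:  # share of gaze time at view edges
        states.add("extreme corner eye movements")
    if features.get("same_focus_seconds", 0.0) > 20.0:
        states.add("static long-time same eye focus")
    if features.get("blink_rate_per_minute", 15.0) < 8.0:
        states.add("less eye blinking")
    return states
```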
- The eye strain context is provided as continuous feedback to the UI adaptor 180 of the VR headset 50. The UI adaptor 180 renders the VR content and may include a rendering engine. Based on the eye fatigue index, the UI adaptor 180 personalizes the adaptation of the velocity, rotation movement calibration, and positions of various objects in the VR space.
- When eye strain is not detected, the UI adaptor 180 renders the original VR content, without modification (e.g., to allow players of a VR game to enjoy the full user experience). However, when eye strain is detected, the UI adaptor 180 adapts and renders the VR content by making one or more of the following UI adaptations: object velocity back and forth adaptation, rotation movement calibration adaptation, and object position adaptation. Additionally, the UI adaptor 180 changes the intensity of the UI adaptations based on the intensity of the eye strain.
- In certain embodiments, for object velocity back and forth adaptation: when objects move back (e.g., away from the user) and forth (e.g., forward toward the user) at a high rate of speed, the user's eye focus rapidly changes, leading to eye strain. So, when eye strain is detected, the UI adaptor 180 scales down the object velocity, and the same experience is rendered through other artifacts. The other artifacts may include more vibration in movement (which makes the object movement appear faster to the user), an increase in pauses between object direction changes, a change in the color of objects, etc. The object velocity back and forth adaptation may also be referred to as object velocity to and from adaptation. Thus, the object velocity back and forth adaptation decreases the velocity of an object in the VR content and adds another artifact, as in the sketch below.
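- A minimal sketch of this adaptation, assuming a hypothetical rendering parameterization: the 50% cap, the vibration amplitude, and the pause duration are invented knobs standing in for the "other artifacts".

```python
def adapt_object_velocity(velocity: float, fatigue_index: float) -> dict[str, float]:
    """Scale down back-and-forth object velocity and compensate with
    other artifacts so the experience still feels the same."""
    scale = 1.0 - 0.5 * fatigue_index  # up to 50% slower at maximum strain
    return {
        "velocity": velocity * scale,
        "vibration_amplitude": 0.2 * fatigue_index,      # makes movement appear faster
        "direction_change_pause_ms": 150.0 * fatigue_index,
    }
```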
- In certain embodiments, for rotation movement calibration adaptation (with regard to horizontal and vertical extremes): a user may find it difficult to view objects in extreme corners (i.e., the left-, right-, top-, or bottom-most areas), because doing so requires more movement of the eyes and/or head toward the corners, which could lead to eye stress if head rotations are not noticed. In these cases, the UI adaptor 180 increases the rotation calibration index, so that a slight movement by the user produces a high magnitude of rotation. In certain embodiments, the rotation calibration index is a scaling number: the higher the eye strain, the higher the rotation calibration index (scale value). For example, when the user moves the head slightly, the VR environment rotates more, based on the scaling value, as in the sketch below.
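- A minimal sketch, assuming the rotation calibration index is a simple linear function of the fatigue index; the 1.0 baseline and 2.0 slope are assumptions.

```python
def rotation_calibration_index(fatigue_index: float) -> float:
    """Higher eye strain -> higher scale value; 1.0 means no amplification."""
    return 1.0 + 2.0 * fatigue_index

def apply_head_rotation(view_yaw_deg: float, head_delta_deg: float,
                        fatigue_index: float) -> float:
    """Rotate the VR view by the head movement, amplified by the index,
    so a slight head movement yields a larger view rotation."""
    return view_yaw_deg + head_delta_deg * rotation_calibration_index(fatigue_index)
```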
- In certain embodiments, for object position adaptation: when the user's focus is more static within a defined area, it may lead to eye stress due to same-point focus issues. So, in these contexts, the UI adaptor 180 places user interaction widgets widely apart, leading the user to more eye wandering in the VR space, which avoids same-area focus issues. One toy layout rule is sketched below.
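- The centroid-based spreading below, and the spread factor, are assumptions for illustration, not the patent's method.

```python
def spread_widgets(positions: list[tuple[float, float]],
                   fatigue_index: float) -> list[tuple[float, float]]:
    """Push interaction widgets away from their shared centroid so the
    eyes wander instead of fixating on one area."""
    if not positions:
        return []
    n = len(positions)
    cx = sum(x for x, _ in positions) / n
    cy = sum(y for _, y in positions) / n
    spread = 1.0 + fatigue_index  # 1.0 = original layout, 2.0 = doubled offsets
    return [(cx + (x - cx) * spread, cy + (y - cy) * spread)
            for x, y in positions]
```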
- With embodiments, these adaptations by the UI adaptor 180 may be made more personalized by capturing the eye fatigue index after each adaptation and deriving a user-acceptance score for every adaptation; personalization is then based on the user-acceptance score. In certain embodiments, the eye fatigue index is derived using the optical sensor and the camera of the VR headset. In certain embodiments, the user-acceptance score is derived by the state predictor 160 based on a detected reduction in eye strain; in alternative embodiments, the user-acceptance score may be based on explicit user feedback. One hypothetical scoring rule is sketched below.
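- A scoring rule consistent with "a detected reduction in eye strain" would be the relative drop in the eye fatigue index after the adaptation; the exact formula is an assumption.

```python
def user_acceptance_score(index_before: float, index_after: float) -> float:
    """1.0 = strain fully relieved; 0.0 = no improvement (or worse)."""
    if index_before <= 0.0:
        return 0.0
    return max(0.0, (index_before - index_after) / index_before)
```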
- With embodiments, the UI adaptor 180 determines the eye strain context of the user and enables personalized adaptation of the object velocity, rotation movement calibration, and object position of various objects in the VR space. Adaptation refers not only to avoiding or removing the object behavior that leads to eye strain, but also to rendering the same experience using other modalities or forms. Thus, the user views the complete user experience while avoiding or limiting eye strain. Due to the continuous feedback, when the user has eye strain, the UI adaptor 180 identifies the context of the eye strain and adapts the UI into a different form so that the user does not lose the user experience and, at the same time, avoids eye strain (to protect the eyes). The UI adaptor 180 may roll back the adaptations to display the original VR content when it determines that eye strain is reduced.
- FIG. 2 illustrates object velocity back and forth adaptation in accordance with certain embodiments.
- Rendering 200 illustrates the VR content before adaptation and includes an object 210 moving very fast to and from the user. Rendering 250 illustrates the VR content after adaptation: the UI adaptor 180 reduces the movement velocity of the object 210 after detecting a rapid eye focus change context, but modifies the object 210 to add spikes.
- When objects move back and forth from the user's eyes at a high rate of speed, the user's eye focus rapidly changes, which leads to eye strain. When such eye strain is detected, the UI adaptor 180 scales down (slows down) the object velocity, and the same experience is rendered through other artifacts.
- FIG. 3 illustrates rotation movement calibration adaptation in accordance with certain embodiments.
- Rendition 300 illustrates the VR content before adaptation and shows an elephant 310 and a person 320; in rendition 300, the eye movement and view movement are the same. Rendition 350 illustrates the VR content after adaptation: the UI adaptor 180 has increased the calibration such that, with less eye movement, more view rotation is achieved.
- Users find it difficult to view objects in extreme corners, which leads to more movement of the eyes and/or head toward the corners and could lead to eye stress if head rotations are not noticed. In these cases, the UI adaptor 180 increases the rotation calibration index, so that slight eye and/or head movements yield an increased magnitude of rotation.
- FIG. 4 illustrates UI position adaption in accordance with certain embodiments.
- Rendition 400 illustrates the VR content before adaptation; the No and Yes boxes are close to each other. Rendition 450 illustrates the VR content after adaptation: the No box has been moved (i.e., its position has been adapted in the UI). In particular, the No and Yes boxes are placed wider apart, leading to more eye wandering.
- When the user's focus is more static in a defined area, this may lead to eye stress due to same-point focus issues. In these contexts, the UI adaptor 180 places user interaction widgets wider apart, leading the user to more eye wandering in the VR space, which avoids same-area focus issues.
- FIGS. 5A and 5B illustrate, in a flowchart, operations for adapting a UI of a VR headset in accordance with certain embodiments.
- Control begins at block 500 with the UI adaptor 180 determining an initial eye strain context that indicates that the user's eyes are strained while wearing a VR headset to view VR content in a UI.
- In block 502, the UI adaptor 180 identifies a UI adaptation and an intensity of the UI adaptation, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. In certain embodiments, the UI adaptations are generated based on processing a sensor stream to generate an eye blink rate, iris enlargement and contraction event data, and an eye fatigue index; processing an application on/off time to generate an expected application usage time; and processing eye and head interaction events to generate an interaction rate for each UI control.
- In block 504, the UI adaptor 180 renders modified VR content in the UI by applying the UI adaptation based on the intensity of the UI adaptation. In certain embodiments, the UI adaptor 180 identifies and applies multiple UI adaptations at one time.
- In block 506, the UI adaptor 180 determines an updated eye strain context. In block 508, the UI adaptor 180 determines whether the updated eye strain context indicates that the user's eye strain has decreased (i.e., the user is experiencing less eye strain). If so, processing continues to block 510; otherwise, processing continues to block 514 (FIG. 5B).
- In block 510, the UI adaptor 180 increases a priority weight for the UI adaptation that decreased (i.e., improved) the eye strain. In certain embodiments, the increase in the priority weight reflects how much of an improvement there was in eye strain (e.g., the greater the decrease in eye strain, the higher the priority weight increase). In block 512, the UI adaptor 180 saves the UI adaptation, the intensity of the UI adaptation, and the priority weight in a user profile for the user, to save the personalization for the user.
- In block 514, the UI adaptor 180 determines whether there are any more UI adaptations available to try. If so, processing continues to block 516; otherwise, processing is done.
- In block 516, the UI adaptor 180 identifies a UI adaptation that has not been applied yet and an intensity of that UI adaptation. In block 518, the UI adaptor 180 renders modified VR content in the UI by applying the UI adaptation based on the intensity of the UI adaptation.
- In block 520, the UI adaptor 180 determines an updated eye strain context. In block 522, the UI adaptor 180 determines whether the updated eye strain context indicates that the user's eye strain has decreased (i.e., the user is experiencing less eye strain). If so, processing continues to block 524; otherwise, processing continues to block 514 (FIG. 5B).
- In block 524, the UI adaptor 180 increases a priority weight for the UI adaptation that decreased (i.e., improved) the eye strain. In block 526, the UI adaptor 180 saves the UI adaptation, the intensity of the UI adaptation, and the priority weight in the user profile for the user, to save the personalization for the user.
- By iterating through the possible UI adaptations, the UI adaptor 180 determines which of the UI adaptations has the highest positive response from the user, and that UI adaptation is prioritized over a UI adaptation that has only a slight positive response or a negative response from the user. This allows the UI adaptor 180 to later use the user profile to render the UI for the user during a subsequent (e.g., future) use of the VR headset to avoid eye strain. In certain embodiments, when the UI adaptor 180 uses the user profile to render the VR content, the UI adaptor 180 may start by applying the UI adaptation with the highest priority and may then apply the other UI adaptations, in order of priority weights. The overall loop is summarized in the sketch below.
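- The loop of FIGS. 5A and 5B can be summarized in a short Python sketch. The `measure_eye_strain` and `render` callbacks and the profile layout are assumptions; the patent specifies only the control flow.

```python
def adaptation_loop(candidates, profile, measure_eye_strain, render):
    """Try UI adaptations one at a time, keeping the ones that reduce strain.

    candidates: list of (adaptation_name, intensity) pairs to try.
    profile: dict mapping adaptation_name -> (intensity, priority_weight).
    measure_eye_strain: callback returning the current strain level (0..1).
    render: callback that applies an adaptation at a given intensity.
    """
    strain = measure_eye_strain()  # initial eye strain context (block 500)
    for name, intensity in candidates:
        render(name, intensity)            # blocks 504/518
        updated = measure_eye_strain()     # blocks 506/520
        if updated < strain:               # blocks 508/522
            # The greater the improvement, the larger the weight increase.
            improvement = strain - updated
            _, weight = profile.get(name, (intensity, 0.0))
            profile[name] = (intensity, weight + improvement)  # blocks 510-512
            strain = updated
    return profile
```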
- FIG. 6 illustrates, in a flowchart, operations for using a user profile in accordance with certain embodiments.
- Control begins at block 600 with the UI adaptor 180 determining that a user has started using a VR headset to view VR content in a UI.
- In block 602, the UI adaptor 180 determines whether a user profile is available for this user. If so, processing continues to block 604; otherwise, processing continues to block 606.
- In block 604, the UI adaptor 180 renders modified VR content in the UI by applying the UI adaptation with the highest priority weight, at the UI adaptation intensity stored in the user profile, as sketched below. The other UI adaptations may then be added as needed (per the processing of FIGS. 5A and 5B).
- In block 606, the UI adaptor 180 renders the VR content in the UI without modification.
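- For illustration, the profile lookup of FIG. 6 might look as follows; the profile layout matches the hypothetical one used in the loop sketch above.

```python
def render_for_user(profile, render, render_original):
    """Apply stored personalization at session start (blocks 600-606).

    If a profile exists, apply the adaptation with the highest priority
    weight at its saved intensity; otherwise render unmodified VR content.
    """
    if profile:
        best = max(profile, key=lambda name: profile[name][1])  # highest weight
        intensity, _ = profile[best]
        render(best, intensity)
    else:
        render_original()
```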
- Embodiments may also be used to assess neck strain, headaches, etc., and to adapt the UI to lessen them, based on input from VR headsets that provide information on the neck and head.
- FIG. 7 illustrates a computing environment 710 in accordance with certain embodiments. Computer node 712 is only one example of a suitable computing node and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the invention described herein. Regardless, computer node 712 is capable of being implemented and/or performing any of the functionality set forth hereinabove.
- The computer node 712 may be a computer system, which is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer node 712 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.
- Computer node 712 may be described in the general context of computer system executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer node 712 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media, including memory storage devices.
- Computer node 712 is shown in the form of a general-purpose computing device. The components of computer node 712 may include, but are not limited to, one or more processors or processing units 716, a system memory 728, and a bus 718 that couples various system components, including system memory 728, to the one or more processors or processing units 716.
- Bus 718 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. Such bus architectures include the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnects (PCI) bus.
- Computer node 712 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer node 712, and it includes both volatile and non-volatile media, and removable and non-removable media.
- System memory 728 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 730 and/or cache memory 732. Computer node 712 may further include other removable/non-removable, volatile/non-volatile computer system storage media. For example, storage system 734 can be provided for reading from and writing to a non-removable, non-volatile magnetic medium (not shown and typically called a "hard drive"). System memory 728 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
- Program/utility 740, having a set (at least one) of program modules 742, may be stored in system memory 728 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data, or some combination thereof, may include an implementation of a networking environment. Program modules 742 generally carry out the functions and/or methodologies of embodiments of the invention as described herein.
- Computer node 712 may also communicate with one or more external devices 714, such as a keyboard, a pointing device, a display 724, etc.; one or more devices that enable a user to interact with computer node 712; and/or any devices (e.g., network card, modem, etc.) that enable computer node 712 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 722. Still yet, computer node 712 can communicate with one or more networks, such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet), via network adapter 720.
- Network adapter 720 communicates with the other components of computer node 712 via bus 718. It should be understood that although not shown, other hardware and/or software components could be used in conjunction with computer node 712. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
- The computing device 100 has the architecture of computer node 712.
- The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
- A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium, or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. The remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- Each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Abstract
Description
- Embodiments of the invention relate to personalized adaptation of Virtual Reality (VR) content based on eye strain context.
- A VR headset may be described as a device that may be mounted on a person's head and covers a person's eyes. The VR headset may be used for games, simulator, trainers, etc. The VR headset may provide separate images for each eye, stereo sound, and head motion tracking sensors They comprise a stereoscopic head-mounted display (providing separate images for each eye), stereo sound, and head motion tracking sensors (to allow for shifting the picture as the head moves).
- The VR headset creates a life-size, three-dimensional (3D) virtual environment and enable perception of depth and an increased field of view (width of the picture) to create a sense of immersion.
- Each year a new toy emerges on the market challenging the newest technology and expectations. Even though VR technology is still developing, there are concerns over its long term effects on user's eyesight.
- The simplest way to use VR technology is through head mounted displays (HMD's). With a host of new games, apps and headsets offering an engaging 3D experience for users, it's not surprising that many families are embracing VR technology.
- However, continuous usage of VR headsets (i.e., long viewing of VR content) may lead to eye stress, dizziness, motion sickness. Other problems may occur with eye dryness/focus/movement, headaches, convergence accommodation conflicts, and coordination imbalance. Some VR headsets come with disclaimers stating that children under the age of 13 should not use their headsets because prolonged use may negatively impact hand-eye coordination, balance, and multi-tasking ability.
- Some conventional solutions address eye tracking for medical purposes, color setting, ambient light sensing, adapting data, power efficiency, reducing backlight, color tuning, reduced frame rate during scrolling, finger shadows, network efficiency, and feed-back based adaptation.
- In accordance with embodiments, a computer-implemented method is provided for personalized adaptation of VR content based on eye strain context. An initial eye strain context for a user while wearing a Virtual Reality (VR) headset to view VR content in a User Interface (UI) is determined. A UI adaptation and an intensity of the UI adaptation is identified, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. Modified VR content is rendered in the UI by applying the UI adaptation based on the intensity of the UI adaptation. An updated eye strain context is determined. In response to determining that the updated eye strain context indicates that eye strain has decreased, a priority weight for the UI adaptation is increased and the UI adaptation, the intensity of the UI adaptation, and the priority weight are saved in a user profile for the user.
- In accordance with other embodiments, a computer program product is provided for personalized adaptation of VR content based on eye strain context. The computer program product comprising a computer readable storage medium having program code embodied therewith, the program code executable by at least one processor to perform operations. An initial eye strain context for a user while wearing a Virtual Reality (VR) headset to view VR content in a User Interface (UI) is determined. A UI adaptation and an intensity of the UI adaptation is identified, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. Modified VR content is rendered in the UI by applying the UI adaptation based on the intensity of the UI adaptation. An updated eye strain context is determined. In response to determining that the updated eye strain context indicates that eye strain has decreased, a priority weight for the UI adaptation is increased and the UI adaptation, the intensity of the UI adaptation, and the priority weight are saved in a user profile for the user.
- In yet other embodiments, a computer system is provided for personalized adaptation of VR content based on eye strain context. The computer system comprises one or more processors, one or more computer-readable memories and one or more computer-readable, tangible storage devices; and program instructions, stored on at least one of the one or more computer-readable, tangible storage devices for execution by at least one of the one or more processors via at least one of the one or more memories, to perform operations. An initial eye strain context for a user while wearing a Virtual Reality (VR) headset to view VR content in a User Interface (UI) is determined. A UI adaptation and an intensity of the UI adaptation is identified, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. Modified VR content is rendered in the UI by applying the UI adaptation based on the intensity of the UI adaptation. An updated eye strain context is determined. In response to determining that the updated eye strain context indicates that eye strain has decreased, a priority weight for the UI adaptation is increased and the UI adaptation, the intensity of the UI adaptation, and the priority weight are saved in a user profile for the user.
- Referring now to the drawings in which like reference numbers represent corresponding parts throughout:
-
FIG. 1 illustrates, in a block diagram, a computing environment of a VR headset in accordance with certain embodiments. -
FIG. 2 illustrates object velocity back and forth adaptation in accordance with certain embodiments. -
FIG. 3 illustrates rotation movement calibration adaptation in accordance with certain embodiments. -
FIG. 4 illustrates UI position adaption in accordance with certain embodiments. -
FIGS. 5A and 5B illustrate, in a flowchart, operations for adapting a UI of a VR headset in accordance with certain embodiments. -
FIG. 6 illustrates, in a flowchart, operations for using a user profile in accordance with certain embodiments. -
FIG. 7 illustrates a computing node in accordance with certain embodiments. - The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
- Embodiments enable VR content to auto-adapt to avoid eye fatigue in users (wearers of the VR headset). Embodiments re-represent the VR content in a different User Interface (UI) form to suit the energy needs of the VR headset. Embodiments adapt the UI by considering user feedback, which helps cater to the aesthetics of the UI while also trying to achieve energy/network efficiency. Embodiments incorporate direct and indirect feedback to balance user comfort and energy efficiency.
-
FIG. 1 illustrates, in a block diagram, a computing environment of aVR headset 50 in accordance with certain embodiments. TheVR headset 50 includes acomputing device 100. Thecomputing device 100 includes aneye activity detector 120, an application usage time monitor 130, aninteraction monitor 140, astat predictor 160, aUI adaptor 180, and a UI display 190. The UI 190 displays one or more objects for viewing by the user of the VR headset. - The
VR headset 50 may include sensors or may receive sensor data from sensors apart from theVR headset 50. The sensor data forms asensor stream 110 that is input to aneye activity detector 120. Theeye activity detector 120 processes thesensor stream 110 to output eye blink rate, iris enlargement, contraction event data, and aneye fatigue index 150. - The application usage time monitor 130 receives as input the application on/off
time 112 as input and outputs an expectedapplication usage time 152. - The
interaction monitor 140 receives as input eye and head interaction events 114 (i.e., movements of the eye and head of the user) and outputs an interaction rate for eachUI control 154. In certain embodiments, the UI controls 154 may be described as application widgets. - The
state predictor 160 receives as input the eye blink rate, iris enlargement, contraction event data, and aneye fatigue index 150, the expectedapplication usage time 152, and the interaction rate for eachUI control 154. Using these inputs, thestate predictor 160 outputs an object velocity back and forth adaptation, a rotation movement calibration adaptation, and anobject position adaptation 170 for one or more of the objects in the UI. - The
UI adaptor 180 receives the velocity, the rotation movement calibration, and the position adaptation at anobject level 170 and modifies the UI displayed by the UI display 190. - The
UI adaptor 180 personalizes and adapts the user interface of VR content (e.g., VR games) by tracking the continuous feedback from a VR user's eye strain context. Eye strain context may be any one or more of the following states (but not limited to): rapid eye focus change, more extreme corner eye movements, static long-time same eye-focus, less eye-blinking, and eye tears and tiredness. - In different embodiments, different techniques may be used to detect eye-strain context. In certain embodiments, the
UI adaptor 180 detects eye strain context by deriving an eye fatigue index and then determining the eye strain context. In certain embodiments, the eye fatigue index is derived by theeye activity detector 120 using sensors in the VR headset by tracking eye blink rate, iris enlargement/iris focus change, eye trajectory movements, and ambient lightening. In certain embodiments, the VR headset includes an optical sensor and a camera, which tracks eye movements and other inferences to derive the eye fatigue index. Based on these inputs and correlating these with vision do's and don'ts principles, theUI adaptor 180 determines an eye strain context of the user. Since the VR head-set is very close to the eyes, these inputs may be derived accurately. - The eye strain context is provided as continuous feedback to the
UI adaptor 180 of theVR headset 50. TheUI adaptor 180 renders the VR content and may include a rendering engine. TheUI adaptor 180 adapts and personalizes the adaptation of velocity, rotation movement calibration, and positions of various objects in the VR space based on the eye fatigue index. - When eye strain is not detected, the
UI adaptor 180 renders the original VR content, without modification (e.g., to allow players of a VR game to enjoy the full user experience). However, when eye-strain is detected, theUI adaptor 180 adapts and renders the VR content by making one or more of the following UI adaptations: object velocity back and forth adaptation, rotation movement calibration adaptation, and object position adaptation. Additionally, theUI adaptor 180 changes the intensity of the UI adaptations based on the intensity of the eye strain. - In certain embodiments, for object velocity back and forth adaptation, when objects move back (e.g., away from the user) and forth (e.g., forwards towards the user) from user eyes at a high rate of speed, the user eye focus rapidly changes, leading to eye strain. So, when eye-strain is detected, the
UI adaptor 180 scales down the object velocity and the same experience is rendered through other artifacts. The other artifacts may include: more vibration in movement (which makes the object movement appear faster to the user), an increase in pauses in objects between object direction changes, a change in color of objects, etc. The object velocity back and forth adaptation may also be referred to as object velocity to and from adaptation. Thus, the object velocity back and forth adaptation decreases a velocity of an object in the VR content and adds another artifact. - In certain embodiments, for rotation movement calibration adaptation (with regard to horizontal and vertical extreme), a user may find difficulties in viewing objects in extreme corners (i.e. left/right/top/bottom most) as it leads to more movement of eyes and/or the head towards corners, which could lead to eye stress if head rotations are not noticed. In these cases, the
UI adaptor 180 increases the rotation calibration index, which results in the user moving slightly, which may lead to high magnitude of rotation. In certain embodiments, the rotation calibration index is a scaling number. When eye strain is higher, then the rotation calibration index (scale value) is higher. For example, when the user moves the head slightly, the VR environment rotates more based on the scaling value. - In certain embodiments, for object position adaption, when user focus is more static in a defined area, it may lead to eye stress due to same point focus issues. So, in these contexts, the
UI adaptor 180 places user interaction widgets widely apart, leading the user to more eye wandering in the VR space, which avoids same area focus issues. - With embodiments, these adaptations by the
UI adaptor 180 may be made more personalized by capturing the eye fatigue index after adaption and deriving a user-acceptance score for every adaption. Based on the user-acceptance score, personalization happens. In certain embodiments, the eye fatigue index is derived using the optical sensor and the camera of the VR headset. In certain embodiments, the user-acceptance score is derived by thestate predictor 160 based on a detected reduction in eye strain. In alternative embodiments, the user-acceptance score may be based on user feedback. - With embodiments, the
UI adaptor 180 determines eye strain context of the user and enables personalized adaptation of object velocity, rotation movement calibration, and object position of various objects in the VR space. Adaptation not only refers to avoiding/removing the object behavior that leads to eye-strain, but rendering the same experience using other modalities or forms. Thus, the user views the complete user experience, while avoiding or limiting eye strain. Due to continuous feedback, when the user has eye strain, theUI adaptor 180 identifies the context of the eye-strain and adapts the UI in a different form so that user doesn't lose the user experience and at the same time avoids eye strain (to protect their eyes). TheseUI adaptor 180 may roll back the adaptations to display the original VR content when theUI adaptor 180 determines that eye-strain is reduced. -
FIG. 2 illustrates object velocity back and forth adaptation in accordance with certain embodiments. Rendering 200 illustrates the VR content before adaptation. Rendering 200 includes anobject 210 moving very fast to and from the user. Rendering 250 illustrates the VR content after adaptation. Inrendering 250, theUI adaptor 180 reduces movement velocity of theobject 200 after detecting rapid eye focus change context, but modifies theobject 210 to add spikes. - When objects move back and form from the user's eyes at a high rate of speed, the user's eye focus rapidly changes, which leads to eye strain. When such eye-strain is detected, the
UI adaptor 180 scales down (slows down) object velocity and the same experience is rendered through other artifacts. -
FIG. 3 illustrates rotation movement calibration adaptation in accordance with certain embodiments.Rendition 300 illustrates the VR content before adaptation.Rendition 300 illustrates anelephant 310 and aperson 320. The eye movement and view movements are the same forrendition 300.Rendition 350 illustrates the VR content after adaptation. Inrendition 350, theUI adaptor 180 has increased the calibration such that, with less eye movement, more view rotation is achieved. - Users find difficulties in viewing objects in extreme corners, which leads to more movement of the eye and/or head towards corners, which could lead to eye stress if head rotations are not noticed. In these cases, the
UI adaptor 180 increases the rotation calibration index, which makes the user move eyes and/or head slightly, which may lead to an increased magnitude of rotation. -
FIG. 4 illustrates UI position adaption in accordance with certain embodiments.Rendition 400 illustrates the VR content before adaptation. Inrendition 400, the No and Yes boxes are close to each other.Rendition 450 illustrates the VR content after adaptation. Inrendition 450, the No box has been moved (i.e., it's position has been adapted in the UI). In particular, the No and Yes boxes are placed wider apart, leading to move eye wandering. - When the user focus is more static in a defined area, this may lead to eye stress due to same point focus issues. In these contexts, the
UI adaptor 180 places user interaction widgets farther apart, leading the user's eyes to wander more in the VR space, which avoids same-area focus issues. -
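A sketch of such a repositioning pass is shown below. The one-dimensional layout, the minimum-separation constant, and the function name are assumptions for illustration.

```python
# Hypothetical object-position adaptation: spread interaction widgets apart
# horizontally when static focus is detected. MIN_SEPARATION is illustrative.
MIN_SEPARATION = 0.4  # normalized UI units

def spread_widgets(positions: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Nudge widgets rightward until neighbors meet the minimum separation."""
    adapted = sorted(positions)  # sort by x so neighbors are adjacent
    for i in range(1, len(adapted)):
        prev_x, _ = adapted[i - 1]
        x, y = adapted[i]
        if x - prev_x < MIN_SEPARATION:
            adapted[i] = (prev_x + MIN_SEPARATION, y)
    return adapted

# The No and Yes boxes of FIG. 4 end up wider apart:
print(spread_widgets([(0.45, 0.5), (0.55, 0.5)]))  # [(0.45, 0.5), (0.85, 0.5)]
```
-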
FIGS. 5A and 5B illustrate, in a flowchart, operations for adapting a UI of a VR headset in accordance with certain embodiments. Control begins at block 500 with the UI adaptor 180 determining an initial eye strain context that indicates that the user's eyes are strained while wearing a VR headset to view VR content in a UI. - In
block 502, the UI adaptor 180 identifies a UI adaptation and an intensity of the UI adaptation, where the UI adaptation is any one of an object velocity back and forth adaptation, a rotation movement calibration adaptation, and an object position adaptation. In certain embodiments, the UI adaptations are generated based on: processing a sensor stream to generate an eye blink rate, iris enlargement and contraction event data, and an eye fatigue index; processing application on/off times to generate an expected application usage time; and processing eye and head interaction events to generate an interaction rate for each UI control. -
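A hedged sketch of how such sensor-stream signals might be folded into a single eye fatigue index is shown below. The weights, the baseline blink rate, and the saturation points are assumptions introduced here, not the embodiments' actual processing; iris_enlargement is assumed to be pre-normalized to the range 0 to 1.

```python
# Hypothetical eye-fatigue-index derivation from sensor-stream outputs.
# The baseline blink rate and the weights are illustrative assumptions.
NORMAL_BLINK_RATE = 17.0  # blinks per minute, assumed resting baseline

def eye_fatigue_index(blink_rate: float, iris_enlargement: float,
                      contraction_events: int) -> float:
    """Combine blink rate, iris enlargement, and contraction events into [0, 1]."""
    blink_factor = min(abs(blink_rate - NORMAL_BLINK_RATE) / NORMAL_BLINK_RATE, 1.0)
    contraction_factor = min(contraction_events / 10.0, 1.0)  # saturate at 10 events
    index = 0.5 * blink_factor + 0.3 * iris_enlargement + 0.2 * contraction_factor
    return min(index, 1.0)
```
- In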
block 504, the UI adaptor 180 renders modified VR content in the UI by applying the UI adaptation based on the intensity of the UI adaptation. In certain embodiments, the UI adaptor 180 identifies and applies multiple UI adaptations at one time. - In
block 506, the UI adaptor 180 determines an updated eye strain context. In block 508, the UI adaptor 180 determines whether the updated eye strain context indicates that the user's eye strain has decreased (i.e., the user is experiencing less eye strain). If so, processing continues to block 510; otherwise, processing continues to block 514 (FIG. 5B). - In
block 510, the UI adaptor 180 increases a priority weight for the UI adaptation that decreased the eye strain (i.e., improved the eye strain). In certain embodiments, the increase in the priority weight reflects how much of an improvement there was in eye strain (e.g., the greater the decrease in eye strain, the higher the priority weight increase). In block 512, the UI adaptor 180 saves the UI adaptation, the intensity of the UI adaptation, and the priority weight in a user profile for the user to save the personalization for that user. - In
block 514, the UI adaptor 180 determines whether there are any more UI adaptations available to try. If so, processing continues to block 516; otherwise, processing is done. - In
block 516, the UI adaptor 180 identifies a UI adaptation that has not been applied yet and an intensity of that UI adaptation. In block 518, the UI adaptor 180 renders modified VR content in the UI by applying the UI adaptation based on the intensity of the UI adaptation. - In
block 520, the UI adaptor 180 determines an updated eye strain context. In block 522, the UI adaptor 180 determines whether the updated eye strain context indicates that the user's eye strain has decreased (i.e., the user is experiencing less eye strain). If so, processing continues to block 524; otherwise, processing continues to block 514 (FIG. 5B). - In
block 524, the UI adaptor 180 increases a priority weight for the UI adaptation that decreased the eye strain (i.e., improved the eye strain). In block 526, the UI adaptor 180 saves the UI adaptation, the intensity of the UI adaptation, and the priority weight in the user profile for the user to save the personalization for that user. - By iterating through the possible UI adaptations, the
UI adaptor 180 determines which of the UI adaptations has the highest positive response from the user, and that UI adaptation is prioritized over a UI adaptation that has a slight positive response or a negative response from the user. This allows the UI adaptor 180 to later use the user profile to render the UI for the user for a subsequent (e.g., future) use of the VR headset to avoid eye strain. In certain embodiments, when the UI adaptor 180 uses the user profile to render the VR content, the UI adaptor 180 may start by applying the UI adaptation with the highest priority and may then apply the other UI adaptations, in order of priority weights. -
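The trial loop of FIGS. 5A and 5B can be summarized in a loose Python sketch. The interfaces below (an adaptation object with apply data, an intensity, and a priority_weight; a render callable; a measure_eye_strain callable; and a profile store) are assumptions, and the control flow is an approximation of the flowchart, not the claimed implementation.

```python
# Loose sketch of the FIG. 5A/5B loop: try each untried adaptation, keep and
# weight the ones that reduce eye strain. All interfaces here are assumed.
def personalize(adaptations, render, measure_eye_strain, profile):
    strain = measure_eye_strain()  # block 500: initial eye strain context
    for adaptation in adaptations:  # blocks 514/516: next untried adaptation
        render(adaptation, intensity=adaptation.intensity)  # blocks 504/518
        updated = measure_eye_strain()  # blocks 506/520: updated context
        if updated < strain:  # blocks 508/522: eye strain decreased
            # Blocks 510/524: the greater the decrease, the larger the boost.
            adaptation.priority_weight += strain - updated
            profile.save(adaptation, adaptation.intensity,
                         adaptation.priority_weight)  # blocks 512/526
        strain = updated
```
-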
FIG. 6 illustrates, in a flowchart, operations for using a user profile in accordance with certain embodiments. Control begins at block 600 with the UI adaptor 180 determining that a user has started using a VR headset to view VR content in a UI. In block 602, the UI adaptor 180 determines whether a user profile is available for this user. If so, processing continues to block 604; otherwise, processing continues to block 606. - In
block 604, the UI adaptor 180 renders modified VR content in the UI by applying the UI adaptation with the highest priority weight and based on the UI adaptation intensity in the user profile. The other UI adaptations may then be added as needed (per the processing of FIGS. 5A and 5B); a sketch of this flow follows below. In block 606, the UI adaptor 180 renders the VR content in the UI without modification. - Embodiments may also be used to assess neck strain, headaches, and the like, and to adapt the UI to lessen them, based on input from VR headsets that provide information on the neck and head.
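The FIG. 6 flow maps naturally onto a short sketch. The profile store and the render interface below are assumptions made for illustration, not the claimed implementation.

```python
# Hypothetical sketch of FIG. 6: use the stored profile when available,
# otherwise render the VR content without modification.
def render_for_user(user, profiles, render):
    """Render VR content for a user, applying saved adaptations if present."""
    profile = profiles.get(user)  # block 602: is a user profile available?
    if profile is None:
        render(adaptation=None)  # block 606: original VR content, unmodified
        return
    # Block 604: apply the highest-priority adaptation first, then the rest
    # in descending order of priority weight, as needed.
    for adaptation in sorted(profile.adaptations,
                             key=lambda a: a.priority_weight, reverse=True):
        render(adaptation=adaptation, intensity=adaptation.intensity)
```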
- Conventional solutions are focused on data-independent adaptation of the User Interface (UI) and tuning of the VR system. For instance, given a web page, conventional solutions focus on adapting its style. Also, conventional solutions do not provide any specific content adaptation solution tailored to a VR environment. Unlike such conventional solutions, embodiments provide data-dependent adaptation of the VR content.
- Also, conventional energy/network efficiency solutions thrust an adapted UI upon the user without feedback on whether it is better or not. On the other hand, embodiments use feedback for adaptation of the VR content.
- Moreover, conventional solutions evaluate a representative energy efficient UI in user studies, which cannot be generalized for all web pages in the world. However, embodiments may be used with any VR headset and any VR content.
- Conventional feedback-based techniques do not adapt the UI with the goal of energy efficiency, and their feedback is not focused on the effect of the adapted UI on user comfort. On the other hand, embodiments adapt the VR content with the goal of energy efficiency and use feedback that is focused on the effect of the adapted VR content on user comfort (e.g., eye strain).
-
FIG. 7 illustrates a computing environment 710 in accordance with certain embodiments. Referring to FIG. 7, computer node 712 is only one example of a suitable computing node and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the invention described herein. Regardless, computer node 712 is capable of being implemented and/or performing any of the functionality set forth hereinabove. - The
computer node 712 may be a computer system, which is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer node 712 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like. -
Computer node 712 may be described in the general context of computer system executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer node 712 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media, including memory storage devices. - As shown in
FIG. 7, computer node 712 is shown in the form of a general-purpose computing device. The components of computer node 712 may include, but are not limited to, one or more processors or processing units 716, a system memory 728, and a bus 718 that couples various system components, including system memory 728, to the one or more processors or processing units 716. -
Bus 718 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. -
Computer node 712 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer node 712, and it includes both volatile and non-volatile media, and removable and non-removable media. -
System memory 728 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 730 and/or cache memory 732. Computer node 712 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 734 can be provided for reading from and writing to a non-removable, non-volatile magnetic medium (not shown and typically called a “hard drive”). Although not shown, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”) and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM, or other optical media can be provided. In such instances, each can be connected to bus 718 by one or more data media interfaces. As will be further depicted and described below, system memory 728 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention. - Program/
utility 740, having a set (at least one) of program modules 742, may be stored in system memory 728 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data, or some combination thereof, may include an implementation of a networking environment. Program modules 742 generally carry out the functions and/or methodologies of embodiments of the invention as described herein. -
Computer node 712 may also communicate with one or more external devices 714, such as a keyboard, a pointing device, a display 724, etc.; one or more devices that enable a user to interact with computer node 712; and/or any devices (e.g., network card, modem, etc.) that enable computer node 712 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 722. Still yet, computer node 712 can communicate with one or more networks, such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet), via network adapter 720. As depicted, network adapter 720 communicates with the other components of computer node 712 via bus 718. It should be understood that, although not shown, other hardware and/or software components could be used in conjunction with computer node 712. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems. - In certain embodiments, the
computing device 100 has the architecture of computer node 712. - The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
- Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
- The foregoing description provides examples of embodiments of the invention, and variations and substitutions may be made in other embodiments.
Claims (18)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/239,377 US10831266B2 (en) | 2019-01-03 | 2019-01-03 | Personalized adaptation of virtual reality content based on eye strain context |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/239,377 US10831266B2 (en) | 2019-01-03 | 2019-01-03 | Personalized adaptation of virtual reality content based on eye strain context |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200218342A1 (en) | 2020-07-09
US10831266B2 (en) | 2020-11-10
Family
ID=71405029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/239,377 Expired - Fee Related US10831266B2 (en) | 2019-01-03 | 2019-01-03 | Personalized adaptation of virtual reality content based on eye strain context |
Country Status (1)
Country | Link |
---|---|
US (1) | US10831266B2 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140347265A1 (en) * | 2013-03-15 | 2014-11-27 | Interaxon Inc. | Wearable computing apparatus and method |
US20160232715A1 (en) * | 2015-02-10 | 2016-08-11 | Fangwei Lee | Virtual reality and augmented reality control with mobile devices |
US20170160798A1 (en) * | 2015-12-08 | 2017-06-08 | Oculus Vr, Llc | Focus adjustment method for a virtual reality headset |
US20190045125A1 (en) * | 2017-08-04 | 2019-02-07 | Nokia Technologies Oy | Virtual reality video processing |
US20190265785A1 (en) * | 2018-12-17 | 2019-08-29 | Intel Corporation | Virtual reality adaptive display control |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6389437B2 (en) | 1998-01-07 | 2002-05-14 | Ion Systems, Inc. | System for converting scrolling display to non-scrolling columnar display |
KR100661659B1 (en) | 2005-06-07 | 2006-12-26 | 삼성전자주식회사 | Display device and control method thereof |
WO2010051037A1 (en) | 2008-11-03 | 2010-05-06 | Bruce Reiner | Visually directed human-computer interaction for medical applications |
CN101815127B (en) | 2010-04-12 | 2014-08-20 | 中兴通讯股份有限公司 | Mobile terminal and method for adjusting screen visual effect thereof |
US20120056910A1 (en) | 2010-08-30 | 2012-03-08 | Qualcomm Incorporated | Calibration of display for color response shifts at different luminance settings and for cross-talk between channels |
US8860653B2 (en) | 2010-09-01 | 2014-10-14 | Apple Inc. | Ambient light sensing technique |
KR20120030639A (en) | 2010-09-20 | 2012-03-29 | 삼성전자주식회사 | Display apparatus and image processing method of the same |
GB201103705D0 (en) | 2011-03-04 | 2011-04-20 | Smoker Elizabeth A | Apparatus for, and method of, detecting, measuring and assessing operator fatigue |
US8749737B2 (en) | 2011-05-09 | 2014-06-10 | Apple Inc. | Display with color control |
CN103281959A (en) | 2011-05-20 | 2013-09-04 | 松下电器产业株式会社 | Visual fatigue-measuring apparatus, method thereof, visual fatigue-measuring system and three-dimensional glasses |
US9183806B2 (en) | 2011-06-23 | 2015-11-10 | Verizon Patent And Licensing Inc. | Adjusting font sizes |
US9183812B2 (en) | 2013-01-29 | 2015-11-10 | Pixtronix, Inc. | Ambient light aware display apparatus |
US9262690B2 (en) | 2013-08-27 | 2016-02-16 | Htc Corporation | Method and device for detecting glare pixels of image |
US9125274B1 (en) | 2014-06-05 | 2015-09-01 | Osram Sylvania, Inc. | Lighting control techniques considering changes in eye sensitivity |
US9478157B2 (en) | 2014-11-17 | 2016-10-25 | Apple Inc. | Ambient light adaptive displays |
CN104503683A (en) | 2014-12-01 | 2015-04-08 | 小米科技有限责任公司 | Eyesight protecting method and device |
US9942532B2 (en) | 2015-11-03 | 2018-04-10 | International Business Machines Corporation | Eye-fatigue reduction system for head-mounted displays |
- 2019-01-03: US application US16/239,377 granted as US10831266B2 (en); status: not active (Expired - Fee Related)
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11375170B2 (en) * | 2019-07-28 | 2022-06-28 | Google Llc | Methods, systems, and media for rendering immersive video content with foveated meshes |
US20220253125A1 (en) * | 2021-02-11 | 2022-08-11 | Facebook Technologies, Llc | Adaptable Personal User Interfaces in Cross-Application Virtual Reality Settings |
US11556169B2 (en) * | 2021-02-11 | 2023-01-17 | Meta Platforms Technologies, Llc | Adaptable personal user interfaces in cross-application virtual reality settings |
WO2023183980A1 (en) * | 2022-03-30 | 2023-10-05 | ResMed Pty Ltd | Display system and user interface |
US12266061B2 (en) | 2022-06-22 | 2025-04-01 | Meta Platforms Technologies, Llc | Virtual personal interface for control and travel between virtual worlds |
US12277301B2 (en) | 2022-08-18 | 2025-04-15 | Meta Platforms Technologies, Llc | URL access to assets within an artificial reality universe on both 2D and artificial reality interfaces |
US12175603B2 (en) | 2022-09-29 | 2024-12-24 | Meta Platforms Technologies, Llc | Doors for artificial reality universe traversal |
US12218944B1 (en) | 2022-10-10 | 2025-02-04 | Meta Platform Technologies, LLC | Group travel between artificial reality destinations |
CN118662092A (en) * | 2024-06-20 | 2024-09-20 | 南通大学 | Eye data analysis system |
Also Published As
Publication number | Publication date |
---|---|
US10831266B2 (en) | 2020-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10831266B2 (en) | Personalized adaptation of virtual reality content based on eye strain context | |
EP3394835B1 (en) | Adjusting video rendering rate of virtual reality content and processing of a stereoscopic image | |
EP3465680B1 (en) | Automatic audio attenuation on immersive display devices | |
KR102159849B1 (en) | Mixed reality display accommodation | |
US10451875B2 (en) | Smart transparency for virtual objects | |
EP2972559B1 (en) | Methods and apparatus for displaying images on a head mounted display | |
US20160027212A1 (en) | Anti-trip when immersed in a virtual reality environment | |
JP2008257127A (en) | Image display device and image display method | |
US20160260255A1 (en) | Filtering information within augmented reality overlays | |
GB2532954A (en) | Display control system for an augmented reality display system | |
CN112384954B (en) | 3-D Transition | |
JP6787622B2 (en) | Head-mounted display update buffer | |
TW201702807A (en) | Method and device for processing a part of an immersive video content according to the position of reference parts | |
WO2017181588A1 (en) | Method and electronic apparatus for positioning display page | |
CA3199085A1 (en) | Methods and systems of extended reality environment interaction based on eye motions | |
US10748003B2 (en) | Mitigation of augmented reality markup blindness | |
CN107479692B (en) | Virtual reality scene control method and device and virtual reality device | |
US10360704B2 (en) | Techniques for providing dynamic multi-layer rendering in graphics processing | |
US11288873B1 (en) | Blur prediction for head mounted devices | |
KR102286517B1 (en) | Control method of rotating drive dependiong on controller input and head-mounted display using the same | |
JP2024122801A (en) | Processor, image processing method, and image processing program | |
WO2024176568A1 (en) | Information processing method, information processing device, and program | |
TW202001695A (en) | Method and device for predicting trajectory |
Legal Events
Date | Code | Title | Description
---|---|---|---|
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| AS | Assignment | Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MURALI, SRIKANTH K.;ANANTHAPURBACHE, VIJAY KUMAR;EKAMBARAM, VIJAY;AND OTHERS;SIGNING DATES FROM 20181121 TO 20181129;REEL/FRAME:047945/0072
| AS | Assignment | Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK; Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECTIVE ASSIGNMENT TO RE-RECORED ASSIGNMENT PREVIOUSLY RECORDED AT REEL: 047945 FRAME: 0072. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:MURALI, SRIKANTH K.;ANANTHAPUR BACHE, VIJAY KUMAR;EKAMBARAM, VIJAY;AND OTHERS;SIGNING DATES FROM 20181121 TO 20181129;REEL/FRAME:050630/0222
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
2024-11-10 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20241110