{"id":736,"date":"2025-07-01T10:56:31","date_gmt":"2025-07-01T08:56:31","guid":{"rendered":"https:\/\/luminous-horizon.eu\/?page_id=736"},"modified":"2025-07-02T14:46:29","modified_gmt":"2025-07-02T12:46:29","slug":"360-degree-imaging-technology-for-next-generation-xr","status":"publish","type":"page","link":"https:\/\/luminous-horizon.eu\/index.php\/blogs\/360-degree-imaging-technology-for-next-generation-xr\/","title":{"rendered":"360-Degree Imaging Technology for Next-Generation XR"},"content":{"rendered":"\n<p class=\"has-x-large-font-size\">360-Degree Imaging Technology for Next-Generation XR<\/p>\n\n\n\n<p><strong>A 360-degree wearable camera and recognition AI that can inform users about their surroundings<\/strong>&nbsp;<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li class=\"has-large-font-size\"><strong>Background<\/strong>&nbsp;<\/li>\n<\/ul>\n\n\n\n<p>In situations where extended reality (XR) is used to a high degree, such as daily living assistance, rehabilitation of the disabled, immersive educational and training environments, and in professional and consumer telepresence applications, XR systems must respond according to the user\u2019s expectations and present information in a way that reflects the user\u2019s perceptions.&nbsp;<\/p>\n\n\n\n<p>However, the use of common cameras for XR creates problems such as missing nearby events or not being able to continuously track objects due to the limited field of view. This makes it difficult to gain a more comprehensive understanding of the wearer\u2019s surroundings.&nbsp;<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li class=\"has-large-font-size\"><strong>Solutions<\/strong>&nbsp;<\/li>\n<\/ul>\n\n\n\n<p>Ricoh has developed the world\u2019s first fully egocentric<sup>*1<\/sup> 360-degree wearable camera (as of June 2025, based on Ricoh research). 
Although there are currently some 360-degree cameras available that are worn around the neck or on helmets, this is the world\u2019s first 360-degree wearable camera that is worn close to the wearer\u2019s eyes. It allows for 360-degree recognition of surroundings. Equipped with a microphone and speaker, it also allows for voice interaction with LLMs<sup>*2<\/sup>. The proprietary AI can also warn of approaching objects in the vicinity and detect objects based on arbitrary text. For example, the wearer could ask for \u201cpeople wearing helmets.\u201d&nbsp;<\/p>\n\n\n\n<p><sup>*1<\/sup> egocentric: A view in which the user\u2019s point of view and the virtual camera\u2019s point of view coincide. The user can feel as if they are inside that world.&nbsp;<\/p>\n\n\n\n<p><sup>*2<\/sup> LLMs: Large language models. A technology that can perform processing such as answering questions in natural text or summarizing documents with human-like accuracy, and that can be easily trained. It can process ambiguities and fluctuations that exist in human spoken and written language (natural language) by determining the relationships between words that are far apart in a sentence and taking the context into account.<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"637\" height=\"359\" src=\"http:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-1.jpeg\" alt=\"\" class=\"wp-image-738\" srcset=\"https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-1.jpeg 637w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-1-300x169.jpeg 300w\" sizes=\"auto, (max-width: 637px) 100vw, 637px\" \/><figcaption class=\"wp-element-caption\">Figure 1. 
Initial design of the first-person 360-degree wearable camera <\/figcaption><\/figure>\n\n\n\n<ul class=\"wp-block-list\">\n<li class=\"has-large-font-size\"><strong>Technical highlights<\/strong>&nbsp;<\/li>\n<\/ul>\n\n\n\n<p class=\"has-medium-font-size\"><strong>First-person 360-degree wearable camera<\/strong>&nbsp;<\/p>\n\n\n\n<p>Ricoh has developed a 360-degree camera that can be worn over the ears (Figure 2). This camera allows for a more immersive experience and is capable of recognition from the point of view of the user.&nbsp;<\/p>\n\n\n\n<p>Overview of first-person 360-degree wearable camera&nbsp;<\/p>\n\n\n\n<p>\u30fb Lightweight (can be worn over the ears)&nbsp;<\/p>\n\n\n\n<p>\u30fb Four cameras capture images 360 degrees around the wearer&nbsp;<\/p>\n\n\n\n<p>\u30fb Inertial measurement units&nbsp;<\/p>\n\n\n\n<p>\u30fb Microphone and speaker functionality&nbsp;<\/p>\n\n\n\n<p>\u30fb Resolution: 3840 \u00d7 1920 (pixels)&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"729\" height=\"591\" src=\"http:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-12.png\" alt=\"\" class=\"wp-image-739\" srcset=\"https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-12.png 729w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-12-300x243.png 300w\" sizes=\"auto, (max-width: 729px) 100vw, 729px\" \/><figcaption class=\"wp-element-caption\">Figure 2.\tFirst-person 360-degree wearable camera  <\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" width=\"286\" height=\"146\" src=\"http:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-11.png\" alt=\"\" class=\"wp-image-737\" style=\"width:625px;height:auto\"\/><figcaption class=\"wp-element-caption\">Figure 3.\tImage capture sample (first-person 360-degree wearable camera) <\/figcaption><\/figure>\n\n\n\n<p>As shown in Figure 4, 
the device is controlled by a small single-board PC, making the entire system portable. The device can also communicate in real time with a server, allowing it to connect to LLMs and other state-of-the-art AI models and interact with them via voice.&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"321\" src=\"http:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-13-1024x321.png\" alt=\"\" class=\"wp-image-740\" srcset=\"https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-13-1024x321.png 1024w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-13-300x94.png 300w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-13-768x241.png 768w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-13.png 1126w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Figure 4.\tOverall system diagram <\/figcaption><\/figure>\n\n\n\n<p class=\"has-medium-font-size\"><strong>360-degree AI<\/strong>&nbsp;<\/p>\n\n\n\n<p class=\"has-medium-font-size\"><span style=\"text-decoration: underline;\">Open-vocabulary object detection<\/span>\u00a0<\/p>\n\n\n\n<p>We have developed a real-time open-vocabulary object detection method for 360-degree video. The user can specify what to detect by freely entering text on the PC, and this function holds promise in a variety of fields. 
For example, it could be used to manage safety in a factory, such as by entering text to detect \u201cpeople wearing safety vests.\u201d&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"670\" src=\"http:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-14-1024x670.png\" alt=\"\" class=\"wp-image-741\" srcset=\"https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-14-1024x670.png 1024w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-14-300x196.png 300w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-14-768x503.png 768w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-14.png 1225w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Figure 5.\tOpen-vocabulary object detection <\/figcaption><\/figure>\n\n\n\n<p class=\"has-medium-font-size\"><span style=\"text-decoration: underline;\">Collision detection\u00a0<\/span><\/p>\n\n\n\n<p>We have developed an AI that detects approaching humans in 360-degree video. 
\u201cRICOH THETA,\u201d a 360-degree camera, is installed on a forklift and notifies the driver of people approaching the forklift to help prevent unexpected accidents such as colliding with people in the driver\u2019s blind spots.&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"509\" src=\"http:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-15-1024x509.png\" alt=\"\" class=\"wp-image-742\" srcset=\"https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-15-1024x509.png 1024w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-15-300x149.png 300w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-15-768x382.png 768w, https:\/\/luminous-horizon.eu\/wp-content\/uploads\/2025\/07\/image-15.png 1301w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption class=\"wp-element-caption\">Figure 6. Collision detection <\/figcaption><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>360-Degree Imaging Technology for Next-Generation XR A 360-degree wearable camera and recognition AI that can inform users about their surroundings&nbsp; In situations where extended reality (XR) is used to a high degree, such as daily living assistance, rehabilitation of the disabled, immersive educational and training environments, and in professional and consumer telepresence applications, XR systems 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":696,"menu_order":7,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-736","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/pages\/736","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/comments?post=736"}],"version-history":[{"count":5,"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/pages\/736\/revisions"}],"predecessor-version":[{"id":863,"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/pages\/736\/revisions\/863"}],"up":[{"embeddable":true,"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/pages\/696"}],"wp:attachment":[{"href":"https:\/\/luminous-horizon.eu\/index.php\/wp-json\/wp\/v2\/media?parent=736"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}