<?xml version="1.0" encoding="UTF-8" ?>
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dc:title>3D as‐built environments in extended reality applications: a systematic review</dc:title>
<dc:creator>Balado Frías, Jesús</dc:creator>
<dc:creator>Feng, Yu</dc:creator>
<dc:creator>Qiu, Zhouyan</dc:creator>
<dc:creator>Gao, Weixiao</dc:creator>
<dc:creator>Julin, Arttu</dc:creator>
<dc:subject>3308 Ingeniería y Tecnología del Medio Ambiente</dc:subject>
<dc:description>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</dc:description>
<dc:description>Universidade de Vigo/CISUG</dc:description>
<dc:description>Xunta de Galicia | Ref. EDC431C 2024/30</dc:description>
<dc:description>Xunta de Galicia | Ref. ED431F 2024/06</dc:description>
<dc:description>Agencia Estatal de Investigación | Ref. RYC2022-038100-I</dc:description>
<dc:description>Agencia Estatal de Investigación | Ref. PID2021-123475OA-I00</dc:description>
<dc:description>Deutsche Forschungsgemeinschaft | Ref. 499168241</dc:description>
<dc:description>National Research Council of Finland</dc:description>
<dc:description>Business Finland | Ref. MIXER (3475/31/2023)</dc:description>
<dc:date>2026-04-27T11:00:26Z</dc:date>
<dc:date>2026-04-27T11:00:26Z</dc:date>
<dc:date>2026-04</dc:date>
<dc:date>2026-04-21T07:45:33Z</dc:date>
<dc:type>article</dc:type>
<dc:identifier>The Photogrammetric Record, 41(194): e70046 (2026)</dc:identifier>
<dc:identifier>0031868X</dc:identifier>
<dc:identifier>14779730</dc:identifier>
<dc:identifier>http://hdl.handle.net/11093/11956</dc:identifier>
<dc:identifier>10.1111/phor.70046</dc:identifier>
<dc:identifier>https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/EC/HE/101129961</dc:relation>
<dc:rights>Attribution 4.0 International</dc:rights>
<dc:rights>https://creativecommons.org/licenses/by/4.0/</dc:rights>
<dc:rights>openAccess</dc:rights>
<dc:publisher>The Photogrammetric Record</dc:publisher>
<dc:publisher>Enxeñaría dos recursos naturais e medio ambiente</dc:publisher>
<dc:publisher>Xeotecnoloxías Aplicadas</dc:publisher>
</oai_dc:dc>
<?xml version="1.0" encoding="UTF-8" ?>
<d:DIDL xmlns:d="urn:mpeg:mpeg21:2002:02-DIDL-NS" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="urn:mpeg:mpeg21:2002:02-DIDL-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/did/didl.xsd">
<d:DIDLInfo>
<dcterms:created xmlns:dcterms="http://purl.org/dc/terms/" xsi:schemaLocation="http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/dcterms.xsd">2026-04-27T11:00:26Z</dcterms:created>
</d:DIDLInfo>
<d:Item id="hdl_11093_11956">
<d:Descriptor>
<d:Statement mimeType="application/xml; charset=utf-8">
<dii:Identifier xmlns:dii="urn:mpeg:mpeg21:2002:01-DII-NS" xsi:schemaLocation="urn:mpeg:mpeg21:2002:01-DII-NS http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-21_schema_files/dii/dii.xsd">urn:hdl:11093/11956</dii:Identifier>
</d:Statement>
</d:Descriptor>
<d:Descriptor>
<d:Statement mimeType="application/xml; charset=utf-8">
<oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:dc="http://purl.org/dc/elements/1.1/" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dc:title>3D as‐built environments in extended reality applications: a systematic review</dc:title>
<dc:creator>Balado Frías, Jesús</dc:creator>
<dc:creator>Feng, Yu</dc:creator>
<dc:creator>Qiu, Zhouyan</dc:creator>
<dc:creator>Gao, Weixiao</dc:creator>
<dc:creator>Julin, Arttu</dc:creator>
<dc:description>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</dc:description>
<dc:date>2026-04-27T11:00:26Z</dc:date>
<dc:date>2026-04-27T11:00:26Z</dc:date>
<dc:date>2026-04</dc:date>
<dc:date>2026-04-21T07:45:33Z</dc:date>
<dc:type>article</dc:type>
<dc:identifier>The Photogrammetric Record, 41(194): e70046 (2026)</dc:identifier>
<dc:identifier>0031868X</dc:identifier>
<dc:identifier>14779730</dc:identifier>
<dc:identifier>http://hdl.handle.net/11093/11956</dc:identifier>
<dc:identifier>10.1111/phor.70046</dc:identifier>
<dc:identifier>https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>info:eu-repo/grantAgreement/EC/HE/101129961</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</dc:relation>
<dc:rights>https://creativecommons.org/licenses/by/4.0/</dc:rights>
<dc:rights>openAccess</dc:rights>
<dc:rights>Attribution 4.0 International</dc:rights>
<dc:publisher>The Photogrammetric Record</dc:publisher>
<dc:publisher>Enxeñaría dos recursos naturais e medio ambiente</dc:publisher>
<dc:publisher>Xeotecnoloxías Aplicadas</dc:publisher>
</oai_dc:dc>
</d:Statement>
</d:Descriptor>
<d:Component id="11093_11956_4">
</d:Component>
</d:Item>
</d:DIDL>
<?xml version="1.0" encoding="UTF-8" ?>
<dim:dim xmlns:dim="http://www.dspace.org/xmlns/dspace/dim" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.dspace.org/xmlns/dspace/dim http://www.dspace.org/schema/dim.xsd">
<dim:field authority="8101" confidence="600" element="contributor" mdschema="dc" qualifier="author">Balado Frías, Jesús</dim:field>
<dim:field authority="6735cfa0-7f0d-48fd-aa91-e796508b0947" confidence="600" element="contributor" mdschema="dc" qualifier="author">Feng, Yu</dim:field>
<dim:field authority="8659" confidence="600" element="contributor" mdschema="dc" qualifier="author">Qiu, Zhouyan</dim:field>
<dim:field authority="203da513-26f3-40e4-8866-947b1c37e629" confidence="600" element="contributor" mdschema="dc" qualifier="author">Gao, Weixiao</dim:field>
<dim:field authority="b8444a62-585d-4b18-94ed-b2cfb29cad72" confidence="600" element="contributor" mdschema="dc" qualifier="author">Julin, Arttu</dim:field>
<dim:field element="date" mdschema="dc" qualifier="accessioned">2026-04-27T11:00:26Z</dim:field>
<dim:field element="date" mdschema="dc" qualifier="available">2026-04-27T11:00:26Z</dim:field>
<dim:field element="date" mdschema="dc" qualifier="issued">2026-04</dim:field>
<dim:field element="date" mdschema="dc" qualifier="updated">2026-04-21T07:45:33Z</dim:field>
<dim:field element="identifier" lang="spa" mdschema="dc" qualifier="citation">The Photogrammetric Record, 41(194): e70046 (2026)</dim:field>
<dim:field element="identifier" mdschema="dc" qualifier="issn">0031868X</dim:field>
<dim:field element="identifier" mdschema="dc" qualifier="issn">14779730</dim:field>
<dim:field element="identifier" mdschema="dc" qualifier="uri">http://hdl.handle.net/11093/11956</dim:field>
<dim:field element="identifier" mdschema="dc" qualifier="doi">10.1111/phor.70046</dim:field>
<dim:field element="identifier" lang="spa" mdschema="dc" qualifier="editor">https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</dim:field>
<dim:field element="description" lang="en" mdschema="dc" qualifier="abstract">Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Universidade de Vigo/CISUG</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Xunta de Galicia | Ref. EDC431C 2024/30</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Xunta de Galicia | Ref. ED431F 2024/06</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Agencia Estatal de Investigación | Ref. RYC2022-038100-I</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Agencia Estatal de Investigación | Ref. PID2021-123475OA-I00</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Deutsche Forschungsgemeinschaft | Ref. 499168241</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">National Research Council of Finland</dim:field>
<dim:field element="description" lang="spa" mdschema="dc" qualifier="sponsorship">Business Finland | Ref. MIXER (3475/31/2023)</dim:field>
<dim:field element="language" lang="spa" mdschema="dc" qualifier="iso">eng</dim:field>
<dim:field element="publisher" lang="spa" mdschema="dc">The Photogrammetric Record</dim:field>
<dim:field element="publisher" lang="spa" mdschema="dc" qualifier="departamento">Enxeñaría dos recursos naturais e medio ambiente</dim:field>
<dim:field element="publisher" lang="spa" mdschema="dc" qualifier="grupoinvestigacion">Xeotecnoloxías Aplicadas</dim:field>
<dim:field element="relation" mdschema="dc">info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</dim:field>
<dim:field element="relation" mdschema="dc">info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</dim:field>
<dim:field element="relation" lang="spa" mdschema="dc" qualifier="projectID">info:eu-repo/grantAgreement/EC/HE/101129961</dim:field>
<dim:field element="rights" mdschema="dc">Attribution 4.0 International</dim:field>
<dim:field element="rights" mdschema="dc" qualifier="uri">https://creativecommons.org/licenses/by/4.0/</dim:field>
<dim:field element="rights" lang="spa" mdschema="dc" qualifier="accessRights">openAccess</dim:field>
<dim:field element="title" lang="en" mdschema="dc">3D as‐built environments in extended reality applications: a systematic review</dim:field>
<dim:field element="type" lang="spa" mdschema="dc">article</dim:field>
<dim:field element="subject" lang="spa" mdschema="dc" qualifier="unesco">3308 Ingeniería y Tecnología del Medio Ambiente</dim:field>
<dim:field element="computerCitation" lang="spa" mdschema="dc">pub_title=The Photogrammetric Record|volume=41|journal_number=194|start_pag=e70046|end_pag=</dim:field>
</dim:dim>
<?xml version="1.0" encoding="UTF-8" ?>
<thesis xmlns="http://www.ndltd.org/standards/metadata/etdms/1.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.ndltd.org/standards/metadata/etdms/1.0/ http://www.ndltd.org/standards/metadata/etdms/1.0/etdms.xsd">
<title>3D as‐built environments in extended reality applications: a systematic review</title>
<creator>Balado Frías, Jesús</creator>
<creator>Feng, Yu</creator>
<creator>Qiu, Zhouyan</creator>
<creator>Gao, Weixiao</creator>
<creator>Julin, Arttu</creator>
<description>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</description>
<date>2026-04-27</date>
<date>2026-04-27</date>
<date>2026-04</date>
<date>2026-04-21</date>
<type>article</type>
<identifier>The Photogrammetric Record, 41(194): e70046 (2026)</identifier>
<identifier>0031868X</identifier>
<identifier>14779730</identifier>
<identifier>http://hdl.handle.net/11093/11956</identifier>
<identifier>10.1111/phor.70046</identifier>
<identifier>https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</identifier>
<language>eng</language>
<relation>info:eu-repo/grantAgreement/EC/HE/101129961</relation>
<relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</relation>
<relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</relation>
<rights>https://creativecommons.org/licenses/by/4.0/</rights>
<rights>openAccess</rights>
<rights>Attribution 4.0 International</rights>
<publisher>The Photogrammetric Record</publisher>
<publisher>Enxeñaría dos recursos naturais e medio ambiente</publisher>
<publisher>Xeotecnoloxías Aplicadas</publisher>
</thesis>
<?xml version="1.0" encoding="UTF-8" ?>
<record xmlns="http://www.loc.gov/MARC21/slim" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.loc.gov/MARC21/slim http://www.loc.gov/standards/marcxml/schema/MARC21slim.xsd">
<leader>00925njm 22002777a 4500</leader>
<datafield ind1=" " ind2=" " tag="042">
<subfield code="a">dc</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Balado Frías, Jesús</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Feng, Yu</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Qiu, Zhouyan</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Gao, Weixiao</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="720">
<subfield code="a">Julin, Arttu</subfield>
<subfield code="e">author</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="260">
<subfield code="c">2026-04</subfield>
</datafield>
<datafield ind1=" " ind2=" " tag="520">
<subfield code="a">Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">The Photogrammetric Record, 41(194): e70046 (2026)</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">0031868X</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">14779730</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">http://hdl.handle.net/11093/11956</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">10.1111/phor.70046</subfield>
</datafield>
<datafield ind1="8" ind2=" " tag="024">
<subfield code="a">https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</subfield>
</datafield>
<datafield ind1="0" ind2="0" tag="245">
<subfield code="a">3D as‐built environments in extended reality applications: a systematic review</subfield>
</datafield>
</record>
<?xml version="1.0" encoding="UTF-8" ?>
<mets xmlns="http://www.loc.gov/METS/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" ID="DSpace_ITEM_11093-11956" OBJID="hdl:11093/11956" PROFILE="DSpace METS SIP Profile 1.0" TYPE="DSpace ITEM" xsi:schemaLocation="http://www.loc.gov/METS/ http://www.loc.gov/standards/mets/mets.xsd">
<metsHdr CREATEDATE="2026-04-28T03:33:55Z">
<agent ROLE="CUSTODIAN" TYPE="ORGANIZATION">
<name>Investigo</name>
</agent>
</metsHdr>
<dmdSec ID="DMD_11093_11956">
<mdWrap MDTYPE="MODS">
<xmlData xsi:schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-1.xsd">
<mods:mods xmlns:mods="http://www.loc.gov/mods/v3" xsi:schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-1.xsd">
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Balado Frías, Jesús</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Feng, Yu</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Qiu, Zhouyan</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Gao, Weixiao</mods:namePart>
</mods:name>
<mods:name>
<mods:role>
<mods:roleTerm type="text">author</mods:roleTerm>
</mods:role>
<mods:namePart>Julin, Arttu</mods:namePart>
</mods:name>
<mods:extension>
<mods:dateAccessioned encoding="iso8601">2026-04-27T11:00:26Z</mods:dateAccessioned>
</mods:extension>
<mods:extension>
<mods:dateAvailable encoding="iso8601">2026-04-27T11:00:26Z</mods:dateAvailable>
</mods:extension>
<mods:originInfo>
<mods:dateIssued encoding="iso8601">2026-04</mods:dateIssued>
</mods:originInfo>
<mods:identifier type="citation">The Photogrammetric Record, 41(194): e70046 (2026)</mods:identifier>
<mods:identifier type="issn">0031868X</mods:identifier>
<mods:identifier type="issn">14779730</mods:identifier>
<mods:identifier type="uri">http://hdl.handle.net/11093/11956</mods:identifier>
<mods:identifier type="doi">10.1111/phor.70046</mods:identifier>
<mods:identifier type="editor">https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</mods:identifier>
<mods:abstract>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</mods:abstract>
<mods:language>
<mods:languageTerm authority="rfc3066">eng</mods:languageTerm>
</mods:language>
<mods:accessCondition type="useAndReproduction">Attribution 4.0 International</mods:accessCondition>
<mods:titleInfo>
<mods:title>3D as‐built environments in extended reality applications: a systematic review</mods:title>
</mods:titleInfo>
<mods:genre>article</mods:genre>
</mods:mods>
</xmlData>
</mdWrap>
</dmdSec>
<amdSec ID="TMD_11093_11956">
<rightsMD ID="RIG_11093_11956">
<mdWrap MDTYPE="OTHER" MIMETYPE="text/plain" OTHERMDTYPE="DSpaceDepositLicense">
<binData>TElDRU5DSUEgSU5WRVNUSUdPCgpDb24gb2JqZXRvIGRlIHF1ZSBJbnZlc3RpZ28gcHVlZGEgYXJjaGl2YXIgeSBkaWZ1bmRpciBsb3MgZG9jdW1lbnRvcyBxdWUgc2UgZGVwb3NpdGFuIGVuIMOpbCwgc2UgbmVjZXNpdGEgbGEgYXV0b3JpemFjacOzbiBkZSBsYS9zIGF1dG9yYS9zIG8gYXV0b3IvZXMgZGUgbG9zIGRvY3VtZW50b3MgbWVkaWFudGUgbGEgcHJlc2VudGUgbGljZW5jaWEgZGUgZGlzdHJpYnVjacOzbiBubyBleGNsdXNpdmEgKOKAnExpY2VuY2lhIEludmVzdGlnb+KAnSkuIAoKQWwgb3RvcmdhciBlc3RhIGxpY2VuY2lhLCBsYS9zIGF1dG9yYS9zIG8gYXV0b3IvZXMgZGUgbG9zIGRvY3VtZW50b3MgbWFudGllbmVuIGVuIHN1IHBvZGVyIGxhIHRvdGFsaWRhZCBkZSBsb3MgZGVyZWNob3MgZGUgYXV0b3IsIHB1ZGllbmRvLCBwb3IgdGFudG8sIGhhY2VyIHVzbyBkZWwgdHJhYmFqbyBkZXBvc2l0YWRvIGVuIGxhIGZvcm1hIGVuIHF1ZSBlc3RpbWVuIG9wb3J0dW5vIChkZXBvc2l0YXJsbyBlbiBvdHJvcyByZXBvc2l0b3Jpb3MsIHB1YmxpY2FybG8gZW4gbWVkaW9zIGNvbWVyY2lhbGVzLCBkaWZ1bmRpw6luZG9sbyBlbiBzdSBww6FnaW5hIHdlYiwgZXRjLikuIAoKUG9yIGZhdm9yLCBsZWEgYXRlbnRhbWVudGUgbG9zIHTDqXJtaW5vcyBxdWUgYSBjb250aW51YWNpw7NuIHNlIHNlw7FhbGFuLCBlbiBsb3MgY3VhbGVzIHVzdGVkIHBlcm1pdGUgbyBhdXRvcml6YSBlbCBkZXDDs3NpdG8geSBkaWZ1c2nDs24gZGUgc3UgZG9jdW1lbnRvIGVuIEludmVzdGlnbzoKCkVuIHN1IGNvbmRpY2nDs24gZGUgYXV0b3IvYSBvIHByb3BpZXRhcmlvL2EgZGUgbG9zIGRlcmVjaG9zIGRlIGF1dG9yLCB1c3RlZDoKCjEuLSBPdG9yZ2EgYSBsYSBVbml2ZXJzaWRhZGUgZGUgVmlnbyBlbCBkZXJlY2hvIG5vIGV4Y2x1c2l2byBhIGFyY2hpdmFyLCByZXByb2R1Y2lyLCBjb252ZXJ0aXIgZW4gbGEgZm9ybWEgcXVlIG3DoXMgYWJham8gc2UgZGVzY3JpYmUsIGNvbXVuaWNhciBvIGRpc3RyaWJ1aXIgdW5pdmVyc2FsbWVudGUgZWwgZG9jdW1lbnRvIGVuIGZvcm1hdG8gZWxlY3Ryw7NuaWNvOwoKMi4tIEF1dG9yaXphIGEgbGEgVW5pdmVyc2lkYWRlIGRlIFZpZ28gYSBjb25zZXJ2YXIgbcOhcyBkZSB1bmEgY29waWEgZGUgc3UgZG9jdW1lbnRvIHkgYSBxdWUsIHNpbiBhbHRlcmFyIHN1IGNvbnRlbmlkbywgbG8gcHVlZGEgY29udmVydGlyIGEgY3VhbHF1aWVyIG90cm8gZm9ybWF0byBkZSBmaWNoZXJvLCBtZWRpbyBvIHNvcG9ydGUsIGNvbiBwcm9ww7NzaXRvcyBkZSBzZWd1cmlkYWQsIHByZXNlcnZhY2nDs24geSBhY2Nlc287IAoKMy4tIE1hbmlmaWVzdGEgcXVlIGVsIGRvY3VtZW50byBkZXBvc2l0YWRvIGVzIHVuIHRyYWJham8gb3JpZ2luYWwgc3V5byB5IHF1ZSBlc3TDoSBsZWdpdGltYWRvIHBhcmEgb3RvcmdhciBsb3MgZGVyZWNob3MgY29udGVuaWRvcyBlbiBsYSBwcmVzZW50ZSBsaWNlbmNpYSBkZSBkaXN0cmlidWNpw7Nu
LiBEZSBsYSBtaXNtYSBmb3JtYSBkZWNsYXJhIHF1ZSwgZW4gbGEgbWVkaWRhIGRlIGxvIHF1ZSBsZSByZXN1bHRhIHBvc2libGUgY29ub2Nlciwgc3UgZG9jdW1lbnRvIG5vIGluZnJpbmdlIGxvcyBkZXJlY2hvcyBkZSBhdXRvciwgZGUgbmluZ3VuYSBvdHJhIHBlcnNvbmEgbyBlbnRpZGFkOwoKNC4tIEFmaXJtYSBxdWUsIGVuIGVsIGNhc28gZGUgcXVlIHNlIHRyYXRlIGRlIHVuYSBvYnJhIGNvbiBtw6FzIGRlIHVuYSBhdXRvcsOtYSwgbGEgZGVwb3NpdGEgZW4gbm9tYnJlIHkgY29uIGVsIGNvbnNlbnRpbWllbnRvIGRlbCByZXN0byBkZSBjb2F1dG9yZXMgZSBjb2F1dG9yYXM7IAoKNS4tIERlY2xhcmEgcXVlLCBlbiBlbCBjYXNvIGRlIHF1ZSBlbCBkb2N1bWVudG8gY29udGVuZ2EgbWF0ZXJpYWwgZGVsIHF1ZSBubyBwb3NlZSBsb3MgZGVyZWNob3MgZGUgYXV0b3IsIGhhIG9idGVuaWRvIGVsIHBlcm1pc28gZGUgbGEgcGVyc29uYSBwcm9waWV0YXJpYSBkZSB0YWxlcyBkZXJlY2hvcyBwYXJhIG90b3JnYXIgYSBsYSBVbml2ZXJzaWRhZGUgZGUgVmlnbyBsb3MgZGVyZWNob3MgcmVxdWVyaWRvcyBwb3IgZXN0YSBsaWNlbmNpYSwgYXPDrSBjb21vIHF1ZSBlc2UgbWF0ZXJpYWwgY3V5b3MgZGVyZWNob3MgY29ycmVzcG9uZGVuIGEgdGVyY2VyYXMgcGVyc29uYXMgZXN0w6EgY2xhcmFtZW50ZSBpZGVudGlmaWNhZG8geSByZWNvbm9jaWRvIGVuIGVsIHRleHRvIG8gY29udGVuaWRvIGRlbCBkb2N1bWVudG8gZGVwb3NpdGFkbzsgCgo2Li0gUmVjb25vY2UgcXVlIHNpIGVsIGRvY3VtZW50byBzZSBiYXNhIGVuIHRyYWJham9zIHBhdHJvY2luYWRvcyBvIGZpbmFuY2lhZG9zIHBvciB1bmEgb3JnYW5pemFjacOzbiBvIGluc3RpdHVjacOzbiBkaWZlcmVudGUgZGUgbGEgVW5pdmVyc2lkYWRlIGRlIFZpZ28sIGhhIGN1bXBsaWRvIGNvbiBjdWFscXVpZXIgZGVyZWNobyB1IG9ibGlnYWNpw7NuIGVzdGFibGVjaWRhIHBvciBlbCBjb3JyZXNwb25kaWVudGUgY29udHJhdG8gbyBhY3VlcmRvIGNvbiBkaWNoYSBvcmdhbml6YWNpw7NuLgoKCkVuIHZpcnR1ZCBkZSBsYSBwcmVzZW50ZSBsaWNlbmNpYSwgbGEgVW5pdmVyc2lkYWRlIGRlIFZpZ28gc2UgY29tcHJvbWV0ZSBhIGlkZW50aWZpY2FyIGNsYXJhbWVudGUgZWwgbm9tYnJlIGRlIGxhL3MgYXV0b3JhL3MgbyBhdXRvci9lcyBjb21vIHByb3BpZXRhcmlhcy9vcyBkZSBsb3MgZGVyZWNob3MgZGVsIGRvY3VtZW50byBkZXBvc2l0YWRvLCBzaW4gcmVhbGl6YXIgYWx0ZXJhY2nDs24gYWxndW5hIGRlbCBtaXNtbyBleGNlcHRvIGxhcyBwZXJtaXRpZGFzIHBvciBlc3RhIGxpY2VuY2lhLgoKCkxJQ0VOWkEgSU5WRVNUSUdPCgpDbyBvYnhlY3RvIGRlIHF1ZSBJbnZlc3RpZ28gcG9pZGEgYXJxdWl2YXIgZSBkaWZ1bmRpciBvcyBkb2N1bWVudG9zIHF1ZSBzZSBkZXBvc2l0YW4gbmVsLCBuZWNlc8OtdGFzZSBhIGF1dG9yaXphY2nDs24gZGFzIGF1dG9yYXMgb3UgYXV0b3JlcyBkb3MgZG9jdW1lbnRv
cyBtZWRpYW50ZSBhIHByZXNlbnRlIGxpY2VuemEgZGUgZGlzdHJpYnVjacOzbiBub24gZXhjbHVzaXZhICjCq0xpY2VuemEgSW52ZXN0aWdvwrspLgoKQW8gb3V0b3JnYXIgZXN0YSBsaWNlbnphLCBhcyBhdXRvcmFzIG91IGF1dG9yZXMgZG9zIGRvY3VtZW50b3MgbWFudGXDsWVuIG5vIHNldSBwb2RlciBhIHRvdGFsaWRhZGUgZG9zIGRlcmVpdG9zIGRlIGF1dG9yIGUgcG9kZW4sIHBvbG8gdGFudG8sIGZhY2VyIHVzbyBkbyB0cmFiYWxsbyBkZXBvc2l0YWRvIG5hIGZvcm1hIGVuIHF1ZSBlc3RpbWUgb3BvcnR1bm8gKGRlcG9zaXRhbG8gbm91dHJvcyByZXBvc2l0b3Jpb3MsIHB1YmxpY2FsbyBlbiBtZWRpb3MgY29tZXJjaWFpcywgZGlmdW5kaWxvIGEgdHJhdsOpcyBkYSBzw7phIHDDoXhpbmEgd2ViLCBldGMuKS4KClBvciBmYXZvciwgbGVhIGF0ZW50YW1lbnRlIG9zIHRlcm1vcyBxdWUgYSBjb250aW51YWNpw7NuIHNlIHNpbmFsYW4sIG5vcyBjYWxlcyB2b3N0ZWRlIHBlcm1pdGUgb3UgYXV0b3JpemEgbyBkZXDDs3NpdG8gZSBkaWZ1c2nDs24gZG8gc2V1IGRvY3VtZW50byBlbiBJbnZlc3RpZ286CgpOYSBzw7phIGNvbmRpY2nDs24gZGUgYXV0b3IvYXV0b3JhIG91IHByb3BpZXRhcmlhL3Byb3BpZXRhcmlvIGRvcyBkZXJlaXRvcyBkZSBhdXRvciwgdm9zdGVkZToKCjEuLSBPdXTDs3JnYWxsZSDDoSBVbml2ZXJzaWRhZGUgZGUgVmlnbyBvIGRlcmVpdG8gbm9uIGV4Y2x1c2l2byBhIGFycXVpdmFyLCByZXByb2R1Y2lyLCBjb252ZXJ0ZXIgbmEgZm9ybWEgcXVlIG3DoWlzIGFiYWl4byBzZSBkZXNjcmliZSwgY29tdW5pY2FyIG91IGRpc3RyaWJ1w61yIHVuaXZlcnNhbG1lbnRlIG8gZG9jdW1lbnRvIGVuIGZvcm1hdG8gZWxlY3Ryw7NuaWNvLgoKMi4tIEF1dG9yw616YWxsZSDDoSBVbml2ZXJzaWRhZGUgZGUgVmlnbyBhIGNvbnNlcnZhciBtw6FpcyBkdW5oYSBjb3BpYSBkbyBzZXUgZG9jdW1lbnRvIGUgYSBxdWUsIHNlbiBhbHRlcmFyIG8gc2V1IGNvbnRpZG8sIG8gcG9pZGEgY29udmVydGVyIGEgY2FscXVlcmEgb3V0cm8gZm9ybWF0byBkZSBmaWNoZWlybywgbWVkaW8gb3Ugc29wb3J0ZSwgY29uIHByb3DDs3NpdG9zIGRlIHNlZ3VyaWRhZGUsIHByZXNlcnZhY2nDs24gZSBhY2Nlc28uCgozLi0gTWFuaWZlc3RhIHF1ZSBvIGRvY3VtZW50byBkZXBvc2l0YWRvIMOpIHVuIHRyYWJhbGxvIG9yaXhpbmFsIHByb3BpbyBlIHF1ZSBlc3TDoSBsZXhpdGltYWRvIHBhcmEgb3V0b3JnYXIgb3MgZGVyZWl0b3MgY29udGlkb3MgbmEgcHJlc2VudGUgbGljZW56YSBkZSBkaXN0cmlidWNpw7NuLiBEYSBtZXNtYSBmb3JtYSBkZWNsYXJhIHF1ZSwgbmEgbWVkaWRhIGRvIHF1ZSBsbGUgcmVzdWx0YSBwb3NpYmxlIGNvw7FlY2VyLCBvIHNldSBkb2N1bWVudG8gbm9uIGluZnJpbnhlIG9zIGRlcmVpdG9zIGRlIGF1dG9yLCBkZSBuaW5ndW5oYSBvdXRyYSBwZXJzb2Egb3UgZW50aWRhZGUuCgo0Li0gQWZpcm1hIHF1ZSwgbm8gY2FzbyBkZSBxdWUg
c2UgdHJhdGUgZHVuaGEgb2JyYSBjb24gbcOhaXMgZHVuaGEgYXV0b3LDrWEsIGRlcG9zw710YWEgZW4gbm9tZSBlIGNvIGNvbnNlbnRpbWVudG8gZG8gcmVzdG8gZGUgY29hdXRvcmVzIGUgY29hdXRvcmFzLgoKNS4tIERlY2xhcmEgcXVlLCBubyBjYXNvIGRlIHF1ZSBvIGRvY3VtZW50byBjb250ZcOxYSBtYXRlcmlhbCBkbyBxdWUgbm9uIHBvc8O6ZSBvcyBkZXJlaXRvcyBkZSBhdXRvciwgb2J0aXZvIG8gcGVybWlzbyBkYSBwZXJzb2EgcHJvcGlldGFyaWEgZGUgdGFsZXMgZGVyZWl0b3MgcGFyYSBvdXRvcmdhcmxsZSDDoSBVbml2ZXJzaWRhZGUgZGUgVmlnbyBvcyBkZXJlaXRvcyByZXF1aXJpZG9zIHBvciBlc3RhIGxpY2VuemEsIGFzw60gY29tbyBxdWUgZXNlIG1hdGVyaWFsIGN1eG9zIGRlcmVpdG9zIGNvcnJlc3BvbmRlbiBhIHRlcmNlaXJhcyBwZXJzb2FzIGVzdMOhIGNsYXJhbWVudGUgaWRlbnRpZmljYWRvIGUgcmVjb8OxZWNpZG8gbm8gdGV4dG8gb3UgY29udGlkbyBkbyBkb2N1bWVudG8gZGVwb3NpdGFkby4KCjYuLSBSZWNvw7FlY2UgcXVlIHNlIG8gZG9jdW1lbnRvIHNlIGJhc2VhIGVuIHRyYWJhbGxvcyBwYXRyb2NpbmFkb3Mgb3UgZmluYW5jaWFkb3MgcG9yIHVuaGEgb3JnYW5pemFjacOzbiBvdSBpbnN0aXR1Y2nDs24gZGlmZXJlbnRlIGRhIFVuaXZlcnNpZGFkZSBkZSBWaWdvLCBjdW1wcml1IGNvbiBjYWxxdWVyYSBkZXJlaXRvIG91IG9icmlnYSBlc3RhYmxlY2lkYSBwb2xvIGNvcnJlc3BvbmRlbnRlIGNvbnRyYXRvIG91IGFjb3JkbyBjb2EgZGV2YW5kaXRhIG9yZ2FuaXphY2nDs24uIAoKCkVuIHZpcnR1ZGUgZGEgcHJlc2VudGUgbGljZW56YSwgYSBVbml2ZXJzaWRhZGUgZGUgVmlnbyBjb21wcm9tw6l0ZXNlIGEgaWRlbnRpZmljYXIgY2xhcmFtZW50ZSBvIG5vbWUgZGFzIGF1dG9yYXMgZSBhdXRvcmVzLCBhc8OtIGNvbW8gZGFzIHByb3BpZXRhcmlhcyBvdSBwcm9waWV0YXJpb3MgZG9zIGRlcmVpdG9zIGRvIGRvY3VtZW50byBkZXBvc2l0YWRvLCBzZW4gcmVhbGl6YXJsbGUgbmluZ3VuaGEgbW9kaWZpY2FjacOzbiBhZ8OhcyBhcyBwZXJtaXRpZGFzIHBvciBlc3RhIGxpY2VuemEuCg==</binData>
</mdWrap>
</rightsMD>
</amdSec>
<amdSec ID="FO_11093_11956_4">
<techMD ID="TECH_O_11093_11956_4">
<mdWrap MDTYPE="PREMIS">
<xmlData schemaLocation="http://www.loc.gov/standards/premis http://www.loc.gov/standards/premis/PREMIS-v1-0.xsd">
<premis:premis>
<premis:object>
<premis:objectIdentifier>
<premis:objectIdentifierType>URL</premis:objectIdentifierType>
<premis:objectIdentifierValue>https://www.investigo.biblioteca.uvigo.es/xmlui/bitstream/11093/11956/4/2026_balado_reality_applications.pdf</premis:objectIdentifierValue>
</premis:objectIdentifier>
<premis:objectCategory>File</premis:objectCategory>
<premis:objectCharacteristics>
<premis:fixity>
<premis:messageDigestAlgorithm>MD5</premis:messageDigestAlgorithm>
<premis:messageDigest>09cf6bd1dcdd5e09b131ccbdb452ec6b</premis:messageDigest>
</premis:fixity>
<premis:size>1428279</premis:size>
<premis:format>
<premis:formatDesignation>
<premis:formatName>application/pdf</premis:formatName>
</premis:formatDesignation>
</premis:format>
</premis:objectCharacteristics>
<premis:originalName>2026_balado_reality_applications.pdf</premis:originalName>
</premis:object>
</premis:premis>
</xmlData>
</mdWrap>
</techMD>
</amdSec>
<fileSec>
<fileGrp USE="ORIGINAL">
<file ADMID="FO_11093_11956_4" CHECKSUM="09cf6bd1dcdd5e09b131ccbdb452ec6b" CHECKSUMTYPE="MD5" GROUPID="GROUP_BITSTREAM_11093_11956_4" ID="BITSTREAM_ORIGINAL_11093_11956_4" MIMETYPE="application/pdf" SEQ="4" SIZE="1428279">
</file>
</fileGrp>
</fileSec>
<structMap LABEL="DSpace Object" TYPE="LOGICAL">
<div ADMID="DMD_11093_11956" TYPE="DSpace Object Contents">
<div TYPE="DSpace BITSTREAM">
</div>
</div>
</structMap>
</mets>
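The PREMIS `techMD` and the `fileSec` above both record an MD5 digest (`09cf6bd1dcdd5e09b131ccbdb452ec6b`) and a byte size (`1428279`) for the archived PDF. A minimal sketch of how a harvester might verify a downloaded bitstream against that fixity metadata (the helper name `verify_fixity` is hypothetical, not part of the record):

```python
import hashlib

def verify_fixity(path, expected_md5, expected_size):
    """Check a downloaded bitstream against PREMIS fixity metadata:
    recompute the MD5 digest in chunks and count the bytes read."""
    digest = hashlib.md5()
    size = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
            size += len(chunk)
    # CHECKSUMTYPE in the METS file element is MD5; compare case-insensitively.
    return digest.hexdigest() == expected_md5.lower() and size == expected_size
```

In practice the expected digest and size would be read from the `CHECKSUM` and `SIZE` attributes of the METS `file` element (or the corresponding `premis:fixity` and `premis:size` values) rather than hard-coded.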
<?xml version="1.0" encoding="UTF-8" ?>
<mods:mods schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-1.xsd">
<mods:name>
<mods:namePart>Balado Frías, Jesús</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Feng, Yu</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Qiu, Zhouyan</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Gao, Weixiao</mods:namePart>
</mods:name>
<mods:name>
<mods:namePart>Julin, Arttu</mods:namePart>
</mods:name>
<mods:extension>
<mods:dateAvailable encoding="iso8601">2026-04-27T11:00:26Z</mods:dateAvailable>
</mods:extension>
<mods:extension>
<mods:dateAccessioned encoding="iso8601">2026-04-27T11:00:26Z</mods:dateAccessioned>
</mods:extension>
<mods:originInfo>
<mods:dateIssued encoding="iso8601">2026-04</mods:dateIssued>
</mods:originInfo>
<mods:identifier type="citation">The Photogrammetric Record, 41(194): e70046 (2026)</mods:identifier>
<mods:identifier type="issn">0031868X</mods:identifier>
<mods:identifier type="issn">14779730</mods:identifier>
<mods:identifier type="uri">http://hdl.handle.net/11093/11956</mods:identifier>
<mods:identifier type="doi">10.1111/phor.70046</mods:identifier>
<mods:identifier type="editor">https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</mods:identifier>
<mods:abstract>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</mods:abstract>
<mods:language>
<mods:languageTerm>eng</mods:languageTerm>
</mods:language>
<mods:accessCondition type="useAndReproduction">https://creativecommons.org/licenses/by/4.0/</mods:accessCondition>
<mods:accessCondition type="useAndReproduction">openAccess</mods:accessCondition>
<mods:accessCondition type="useAndReproduction">Attribution 4.0 International</mods:accessCondition>
<mods:titleInfo>
<mods:title>3D as‐built environments in extended reality applications: a systematic review</mods:title>
</mods:titleInfo>
<mods:genre>article</mods:genre>
</mods:mods>
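The MODS crosswalk above exposes the record's identifiers with a `type` attribute (`citation`, `issn`, `uri`, `doi`, `editor`). A small ElementTree sketch that groups them by type; the inline `sample` string is a hypothetical, well-formed reduction of the record, since the dump above is display-formatted rather than parseable XML:

```python
import xml.etree.ElementTree as ET

MODS_NS = "http://www.loc.gov/mods/v3"

# Hypothetical well-formed reduction of the MODS record above.
sample = """<mods:mods xmlns:mods="http://www.loc.gov/mods/v3">
  <mods:identifier type="doi">10.1111/phor.70046</mods:identifier>
  <mods:identifier type="uri">http://hdl.handle.net/11093/11956</mods:identifier>
  <mods:identifier type="issn">0031868X</mods:identifier>
  <mods:identifier type="issn">14779730</mods:identifier>
</mods:mods>"""

def identifiers_by_type(xml_text):
    """Return {type: [values]} for every mods:identifier child."""
    root = ET.fromstring(xml_text)
    out = {}
    for el in root.findall(f"{{{MODS_NS}}}identifier"):
        out.setdefault(el.get("type"), []).append(el.text.strip())
    return out
```

With the sample record, `identifiers_by_type(sample)["doi"]` yields `["10.1111/phor.70046"]`, and both ISSNs land under the `issn` key.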
<?xml version="1.0" encoding="UTF-8" ?>
<atom:entry schemaLocation="http://www.w3.org/2005/Atom http://www.kbcafe.com/rss/atom.xsd.xml">
<atom:id>http://hdl.handle.net/11093/11956/ore.xml</atom:id>
<atom:published>2026-04-27T11:00:26Z</atom:published>
<atom:updated>2026-04-27T11:00:26Z</atom:updated>
<atom:source>
<atom:generator>Investigo</atom:generator>
</atom:source>
<atom:title>3D as‐built environments in extended reality applications: a systematic review</atom:title>
<atom:author>
<atom:name>Balado Frías, Jesús</atom:name>
</atom:author>
<atom:author>
<atom:name>Feng, Yu</atom:name>
</atom:author>
<atom:author>
<atom:name>Qiu, Zhouyan</atom:name>
</atom:author>
<atom:author>
<atom:name>Gao, Weixiao</atom:name>
</atom:author>
<atom:author>
<atom:name>Julin, Arttu</atom:name>
</atom:author>
<oreatom:triples>
<rdf:Description about="http://hdl.handle.net/11093/11956/ore.xml#atom">
<dcterms:modified>2026-04-27T11:00:26Z</dcterms:modified>
</rdf:Description>
<rdf:Description about="https://www.investigo.biblioteca.uvigo.es/xmlui/bitstream/11093/11956/4/2026_balado_reality_applications.pdf">
<dcterms:description>ORIGINAL</dcterms:description>
</rdf:Description>
<rdf:Description about="https://www.investigo.biblioteca.uvigo.es/xmlui/bitstream/11093/11956/2/license.txt">
<dcterms:description>LICENSE</dcterms:description>
</rdf:Description>
<rdf:Description about="https://www.investigo.biblioteca.uvigo.es/xmlui/bitstream/11093/11956/3/sword.zip">
<dcterms:description>SWORD</dcterms:description>
</rdf:Description>
</oreatom:triples>
</atom:entry>
<?xml version="1.0" encoding="UTF-8" ?>
<qdc:qualifieddc schemaLocation="http://purl.org/dc/elements/1.1/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dc.xsd http://purl.org/dc/terms/ http://dublincore.org/schemas/xmls/qdc/2006/01/06/dcterms.xsd http://dspace.org/qualifieddc/ http://www.ukoln.ac.uk/metadata/dcmi/xmlschema/qualifieddc.xsd">
<dc:title>3D as‐built environments in extended reality applications: a systematic review</dc:title>
<dc:creator>Balado Frías, Jesús</dc:creator>
<dc:creator>Feng, Yu</dc:creator>
<dc:creator>Qiu, Zhouyan</dc:creator>
<dc:creator>Gao, Weixiao</dc:creator>
<dc:creator>Julin, Arttu</dc:creator>
<dcterms:abstract>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</dcterms:abstract>
<dcterms:dateAccepted>2026-04-27T11:00:26Z</dcterms:dateAccepted>
<dcterms:available>2026-04-27T11:00:26Z</dcterms:available>
<dcterms:created>2026-04-27T11:00:26Z</dcterms:created>
<dcterms:issued>2026-04</dcterms:issued>
<dc:type>article</dc:type>
<dc:identifier>The Photogrammetric Record, 41(194): e70046 (2026)</dc:identifier>
<dc:identifier>0031868X</dc:identifier>
<dc:identifier>14779730</dc:identifier>
<dc:identifier>http://hdl.handle.net/11093/11956</dc:identifier>
<dc:identifier>10.1111/phor.70046</dc:identifier>
<dc:identifier>https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>info:eu-repo/grantAgreement/EC/HE/101129961</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</dc:relation>
<dc:rights>https://creativecommons.org/licenses/by/4.0/</dc:rights>
<dc:rights>openAccess</dc:rights>
<dc:rights>Attribution 4.0 International</dc:rights>
<dc:publisher>The Photogrammetric Record</dc:publisher>
<dc:publisher>Enxeñaría dos recursos naturais e medio ambiente</dc:publisher>
<dc:publisher>Xeotecnoloxías Aplicadas</dc:publisher>
</qdc:qualifieddc>
<?xml version="1.0" encoding="UTF-8" ?>
<rdf:RDF schemaLocation="http://www.openarchives.org/OAI/2.0/rdf/ http://www.openarchives.org/OAI/2.0/rdf.xsd">
<ow:Publication about="oai:www.investigo.biblioteca.uvigo.es:11093/11956">
<dc:title>3D as‐built environments in extended reality applications: a systematic review</dc:title>
<dc:creator>Balado Frías, Jesús</dc:creator>
<dc:creator>Feng, Yu</dc:creator>
<dc:creator>Qiu, Zhouyan</dc:creator>
<dc:creator>Gao, Weixiao</dc:creator>
<dc:creator>Julin, Arttu</dc:creator>
<dc:description>Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</dc:description>
<dc:date>2026-04-27T11:00:26Z</dc:date>
<dc:date>2026-04-27T11:00:26Z</dc:date>
<dc:date>2026-04</dc:date>
<dc:date>2026-04-21T07:45:33Z</dc:date>
<dc:type>article</dc:type>
<dc:identifier>The Photogrammetric Record, 41(194): e70046 (2026)</dc:identifier>
<dc:identifier>0031868X</dc:identifier>
<dc:identifier>14779730</dc:identifier>
<dc:identifier>http://hdl.handle.net/11093/11956</dc:identifier>
<dc:identifier>10.1111/phor.70046</dc:identifier>
<dc:identifier>https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>info:eu-repo/grantAgreement/EC/HE/101129961</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</dc:relation>
<dc:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</dc:relation>
<dc:rights>https://creativecommons.org/licenses/by/4.0/</dc:rights>
<dc:rights>openAccess</dc:rights>
<dc:rights>Attribution 4.0 International</dc:rights>
<dc:publisher>The Photogrammetric Record</dc:publisher>
<dc:publisher>Enxeñaría dos recursos naturais e medio ambiente</dc:publisher>
<dc:publisher>Xeotecnoloxías Aplicadas</dc:publisher>
</ow:Publication>
</rdf:RDF>
<?xml version="1.0" encoding="UTF-8" ?>
<oai_dc:dc schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dcterms:dateAccepted>2026-04-27T11:00:26Z</dcterms:dateAccepted>
<dcterms:available>2026-04-27T11:00:26Z</dcterms:available>
<dcterms:issued>2026-04</dcterms:issued>
<dcterms:identifier_bibliographicCitation lang="spa">The Photogrammetric Record, 41(194): e70046 (2026)</dcterms:identifier_bibliographicCitation>
<dcterms:identifier_issn>0031868X</dcterms:identifier_issn>
<dcterms:identifier_issn>14779730</dcterms:identifier_issn>
<dcterms:identifier_doi>10.1111/phor.70046</dcterms:identifier_doi>
<dcterms:identifier type="dcterms:URI">http://hdl.handle.net/11093/11956</dcterms:identifier>
<dcterms:identifier_editor lang="spa">https://onlinelibrary.wiley.com/doi/10.1111/phor.70046</dcterms:identifier_editor>
<dcterms:abstract lang="en">Accurate integration and navigation of real‐world 3D spaces are fundamental for next‐generation Extended Reality (XR) systems, enhancing immersion, utility, and fidelity. This paper systematically reviews XR workflows using PRISMA guidelines, focusing on 3D data acquisition, modeling, visualization, and user interaction, based on 96 journal publications. Data collection for XR relies on photogrammetry, RGB‐D cameras, and LiDAR, often enhanced by multi‐sensor fusion, although real‐time transmission and semantic alignment remain challenging. XR pipelines are dominated by Building Information Modeling (BIM) software and game engines, frequently integrating Computer‐Aided Design (CAD) models and 3D scanned data. Visualization varies from photorealistic renderings to schematic representations, with Virtual Reality headsets favored for training and Augmented Reality devices applied in inspection and navigation. Interaction paradigms encompass controllers, gestures, gaze, voice, and haptics, with increasing reliance on Artificial Intelligence for multimodal fusion and processing. Despite progress, key challenges persist, including bandwidth limitations, manual 3D modeling, hybrid data management, interoperability issues, and scarcity of open‐source solutions. Additional identified barriers involve balancing visual quality with performance in specific contexts, limited accuracy of non‐invasive Brain‐Computer Interfaces, and restricted market acceptance due to high costs. Overall, XR adoption remains constrained by technical, usability, and accessibility gaps.</dcterms:abstract>
<dcterms:description_sponsorship lang="spa">Universidade de Vigo/CISUG</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">Xunta de Galicia | Ref. EDC431C 2024/30</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">Xunta de Galicia | Ref. ED431F 2024/06</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">Agencia Estatal de Investigación | Ref. RYC2022-038100-I</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">Agencia Estatal de Investigación | Ref. PID2021-123475OA-I00</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">Deutsche Forschungsgemeinschaft | Ref. 499168241</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">National Research Council of Finland</dcterms:description_sponsorship>
<dcterms:description_sponsorship lang="spa">Business Finland | Ref. MIXER (3475/31/2023)</dcterms:description_sponsorship>
<dcterms:language type="dcterms:ISO639-2" lang="spa">eng</dcterms:language>
<dcterms:publisher lang="spa">The Photogrammetric Record</dcterms:publisher>
<dcterms:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/PID2021-123475OA-I00/ES</dcterms:relation>
<dcterms:relation>info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2021-2023/RYC2022-038100-I/ES</dcterms:relation>
<dcterms:relation lang="spa">info:eu-repo/grantAgreement/EC/HE/101129961</dcterms:relation>
<dcterms:rights>Attribution 4.0 International</dcterms:rights>
<dcterms:accessRights lang="spa">openAccess</dcterms:accessRights>
<dcterms:rights_uri type="dcterms:URI">https://creativecommons.org/licenses/by/4.0/</dcterms:rights_uri>
<dcterms:title lang="en">3D as‐built environments in extended reality applications: a systematic review</dcterms:title>
<dcterms:type lang="spa">article</dcterms:type>
<dcterms:computerCitation lang="spa">pub_title=The Photogrammetric Record|volume=41|journal_number=194|start_pag=e70046|end_pag=</dcterms:computerCitation>
<dcterms:publisher_department lang="spa">Enxeñaría dos recursos naturais e medio ambiente</dcterms:publisher_department>
<dcterms:publisher_group lang="spa">Xeotecnoloxías Aplicadas</dcterms:publisher_group>
<dcterms:subject lang="spa">3308 Ingeniería y Tecnología del Medio Ambiente</dcterms:subject>
<dcterms:authorList>8101#Feng, Yu#8659#Gao, Weixiao#Julin, Arttu</dcterms:authorList>
</oai_dc:dc>