<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://old.web3d.org/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Yjung</id>
		<title>Web3D.org - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="https://old.web3d.org/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Yjung"/>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php/Special:Contributions/Yjung"/>
		<updated>2026-04-29T01:07:19Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.25.1</generator>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8512</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8512"/>
				<updated>2014-08-08T18:49:28Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed; rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], [http://www.xj3d.org/extensions/render_texture.html RenderedTexture node] (for multipass rendering - a 2D-texture version of GeneratedCubeMapTexture, first proposed by Xj3D and also implemented in X3DOM and InstantReality, useful for all kinds of NPR, shadows, mirrors, etc.), as well as required or recommended formats for imagery and video (.gif .bmp .svg .flv .exr .hdr etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESOUND] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect KINECT], [https://www.leapmotion.com LEAP]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Fonts Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayer] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''': anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands, feet, and faces&lt;br /&gt;
* '''Interoperability''': add the ''class'' attribute for all nodes in all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption, digital signature (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review, and the old MatrixTransform node (especially useful for CAD, VR/AR etc.; implemented in X3DOM and InstantReality)&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java).&lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the new standard for that rapidly improving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html ISO/IEC 19777-1, X3D Scene Access Interface (SAI) language binding for ECMAScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8511</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8511"/>
				<updated>2014-08-08T18:48:27Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed; rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], [http://www.xj3d.org/extensions/render_texture.html RenderedTexture node] (for multipass rendering - a 2D-texture version of GeneratedCubeMapTexture, first proposed by Xj3D and also implemented in X3DOM and InstantReality, useful for all kinds of NPR, shadows, mirrors, etc.), as well as required or recommended formats for imagery and video (.gif .bmp .svg .flv .exr .hdr etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESOUND] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect KINECT], [https://www.leapmotion.com LEAP]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Fonts Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayer] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''': anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands, feet, and faces&lt;br /&gt;
* '''Interoperability''': add the ''class'' attribute for all nodes in all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption, digital signature (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review, and the MatrixTransform node (especially useful for CAD, VR/AR etc.; implemented in X3DOM and InstantReality)&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java).&lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the new standard for that rapidly improving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html ISO/IEC 19777-1, X3D Scene Access Interface (SAI) language binding for ECMAScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8510</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8510"/>
				<updated>2014-08-08T18:45:32Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed; rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], [http://www.xj3d.org/extensions/render_texture.html RenderedTexture node] (for multipass rendering (a 2D-texture version of GeneratedCubeMapTexture), first proposed by Xj3D, implemented in X3DOM and InstantReality, useful for all kinds of NPR, shadows, mirrors, etc.), and required or recommended formats for imagery and video (.gif .bmp .svg .flv .exr .hdr etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESOUND] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect KINECT], [https://www.leapmotion.com LEAP]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Fonts Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayer] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''': anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands, feet, and faces&lt;br /&gt;
* '''Interoperability''': add the ''class'' attribute for all nodes in all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption, digital signature (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review, and the MatrixTransform node (especially useful for CAD, VR/AR etc.; implemented in X3DOM and InstantReality)&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java).&lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the latest standard for that rapidly evolving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html ISO/IEC 19777-1, X3D Scene Access Interface (SAI) language bindings, Part 1: ECMAScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8509</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8509"/>
				<updated>2014-08-08T18:44:23Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed! Rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], [http://www.xj3d.org/extensions/render_texture.html RenderedTexture node] (a 2D-texture counterpart of GeneratedCubeMapTexture for multipass rendering; first proposed by Xj3D, implemented in X3DOM and InstantReality, and useful for NPR, shadows, mirrors, etc.), and required or recommended formats for imagery and video (.gif .bmp .svg .flv etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESound] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect Kinect], [https://www.leapmotion.com Leap Motion]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Font Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayers] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''': anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands, feet and faces&lt;br /&gt;
* '''Interoperability''': add the ''class'' attribute for all nodes to all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption and digital signatures (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review, and MatrixTransform node (especially useful for CAD, VR/AR, etc.; implemented in X3DOM and InstantReality)&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java).&lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the latest standard for that rapidly evolving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html ISO/IEC 19777-1, X3D Scene Access Interface (SAI) language bindings, Part 1: ECMAScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8508</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8508"/>
				<updated>2014-08-08T18:43:39Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed! Rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], [http://www.xj3d.org/extensions/render_texture.html RenderedTexture node] (a 2D-texture counterpart of GeneratedCubeMapTexture for multipass rendering; first proposed by Xj3D, implemented in X3DOM and InstantReality, and useful for NPR, shadows, mirrors, etc.), and required or recommended formats for imagery and video (.gif .bmp .svg .flv etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESound] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect Kinect], [https://www.leapmotion.com Leap Motion]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Font Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayers] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''': anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands, feet and faces&lt;br /&gt;
* '''Interoperability''': add the ''class'' attribute for all nodes to all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption and digital signatures (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review, and MatrixTransform node (especially useful for AR, etc.)&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java).&lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the latest standard for that rapidly evolving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html ISO/IEC 19777-1, X3D Scene Access Interface (SAI) language bindings, Part 1: ECMAScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8507</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8507"/>
				<updated>2014-08-08T18:41:54Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed! Rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], [http://www.xj3d.org/extensions/render_texture.html RenderedTexture node] (a 2D-texture counterpart of GeneratedCubeMapTexture for multipass rendering; first proposed by Xj3D, implemented in X3DOM and InstantReality, and useful for NPR, shadows, mirrors, etc.), and required or recommended formats for imagery and video (.gif .bmp .svg .flv etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESound] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect Kinect], [https://www.leapmotion.com Leap Motion]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Font Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayers] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''': anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands, feet and faces&lt;br /&gt;
* '''Interoperability''': add the ''class'' attribute for all nodes to all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption and digital signatures (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java). &lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the current standard for that rapidly improving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html 19777-1 Part 2, X3D Scene Access Interface (SAI) language bindings for EcmaScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8506</id>
		<title>X3D version 3.4 Development</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_version_3.4_Development&amp;diff=8506"/>
				<updated>2014-08-08T18:40:20Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: /* Candidate capabilities */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
== Strategic overview ==&lt;br /&gt;
&lt;br /&gt;
[[X3D version 3.4 Development]] efforts are evolutionary improvements to the widely proven X3D Graphics architecture.&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium working groups currently define specification goals and requirements. Working group efforts are often the focus for defining and testing new X3D components.&lt;br /&gt;
&lt;br /&gt;
We publicly review these goals annually during [http://www.web3d2014.org Web3D Conference] and [http://s2014.siggraph.org/attendees/birds-feather SIGGRAPH BOF] meetings.&lt;br /&gt;
&lt;br /&gt;
Suggestions, development and discussion via the [http://web3d.org/mailman/listinfo/x3d-public_web3d.org x3d-public mailing list] are ongoing.&lt;br /&gt;
X3D version 3.4 progress also informs and helps to extend [[X3D version 4.0 Development]].&lt;br /&gt;
&lt;br /&gt;
The following list shows that many interesting capabilities have been proposed and are under way for X3D version 3.4. However, topics on this list are not guaranteed to be completed; rather, these are all works in progress.&lt;br /&gt;
&lt;br /&gt;
Activity and approval proceeds based on technical contributions and Web3D Consortium Member priorities. Please consider [http://web3d.org/membership/join joining Web3D] to help advance 3D graphics on the Web.&lt;br /&gt;
&lt;br /&gt;
== Candidate capabilities ==&lt;br /&gt;
&lt;br /&gt;
Each of the following possibilities for X3D 3.4 has been discussed by the various X3D working groups during meetings and on mailing lists.&lt;br /&gt;
Each potential capability is considered to be a feasible (and in most cases, straightforward) addition to the existing X3D version 3.3 architecture.&lt;br /&gt;
&lt;br /&gt;
*'''Appearance'''&lt;br /&gt;
**'''Materials''': advanced parameters&lt;br /&gt;
**[[X3D Multitexture | Multitexture]]: review for correctness, completeness and conformance of rendering example scenes&lt;br /&gt;
**'''Rendering''': bump maps, shadows, edge smoothing&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/components/shaders.html Shaders]: improved support and better interoperability, library of examples&lt;br /&gt;
**'''Texturing''': [http://en.wikipedia.org/wiki/Texture_atlas Texture atlas], [http://en.wikipedia.org/wiki/Projective_texture_mapping projective texture mapping (PTM)], RenderedTexture (a node for multipass rendering, the 2D-texture counterpart of GeneratedCubeMapTexture; first proposed by Xj3D, implemented in X3DOM and InstantReality, and useful for all kinds of NPR, shadows, mirrors, etc.), and required or recommended formats for imagery and video (.gif .bmp .svg .flv etc.)&lt;br /&gt;
*'''Audio and video''': adding royalty-free formats, streamability, [http://web3d.org/pipermail/x3d-public_web3d.org/2013-December/002681.html disabling attenuation], 3D aural spatialization using reflection from simple geometry (such as [http://gamma.cs.unc.edu/Sound/RESound RESOUND] or [https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html Web Audio API])&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/computer-aided-design-cad Computer Aided Design (CAD)]''' Interactive/Mobile Profile, to include:&lt;br /&gt;
**[http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html CADInterchange profile] plus FillProperties/LineProperties, primitive/Geometry2D nodes, Extrusion, NURBS, ClipPlane&lt;br /&gt;
**Part selection/animation, 3D printing, [http://www.web3d.org/realtime-3d/news/3d-graphics-compress-call-contributions Compressed Binary Encoding (CBE)], possibly [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html annotations component]&lt;br /&gt;
** Building Information Models (BIM), Architecture Engineering Construction (AEC), Physical Sensors&lt;br /&gt;
*'''[http://www.ecma-international.org/publications/standards/Ecma-262.htm ECMAScript]''' (JavaScript) specification revision compatibility with [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html X3D scripting]; possibly add C# or Python support&lt;br /&gt;
*'''Generalized input/output interface support'''&lt;br /&gt;
**Possibly [http://www.cs.unc.edu/Research/vrpn/index.html Virtual Reality Peripheral Network (VRPN)], gesture recognition (such as [http://en.wikipedia.org/wiki/Kinect KINECT], [https://www.leapmotion.com LEAP]), etc.&lt;br /&gt;
** Support for arbitrary sensors and user interaction devices&lt;br /&gt;
* '''Geometry''': point size (or perspective rendering), progressive meshes (suitable for both compression and streaming), 3D ExtrudedText, support for [https://en.wikipedia.org/wiki/Web_typography Web typography] using [http://www.w3.org/TR/WOFF Web Open Fonts Format (WOFF)]&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/x3d-earth Geospatial X3D]''' component: [http://www.igraphics.com/Standards/EnhancedGeospatialComponent_2007_10_30/Part01/X3D.html spatial reference frame (SRF)] and [http://www.opengeospatial.org/standards/kml KML] support, [http://www.opengeospatial.org/projects/initiatives/3dpie OGC 3D Portrayal], [http://web3d.org/pipermail/x3d-public_web3d.org/2010-December/001187.html GpsSensor], [http://openlayers.org OpenLayer] mashups&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/h-anim Humanoid Animation (H-Anim)]''' anatomical correctness for skeleton and skinning, motion capture and playback, interchangeable avatars, animation for hands feet and faces&lt;br /&gt;
* '''Interoperability''': include ''class'' attribute for all nodes to all encodings&lt;br /&gt;
* '''[http://www.json.org JSON]''': JavaScript Object Notation as an X3D encoding ([http://web3d.org/pipermail/x3d-public_web3d.org/2014-July/thread.html#2854 assessment thread]), relation to [https://www.khronos.org/gltf glTF], streaming considerations&lt;br /&gt;
*'''[http://www.web3d.org/realtime-3d/working-groups/medx3d Medical working group]''' capabilities&lt;br /&gt;
** [http://svn.xj3d.org/xj3d_website/trunk/extensions/annotation.html Annotations component] and metadata usage&lt;br /&gt;
** Archival 3D medical records, potential emphasis on [http://en.wikipedia.org/wiki/Traumatic_brain_injury Traumatic brain injury (TBI)] volume visualization&lt;br /&gt;
** Haptics component for force feedback&lt;br /&gt;
** Soft-body physics component to complement rigid-body physics component&lt;br /&gt;
* '''Mixed and Augmented Reality (MAR)''': integration of multiple capabilities with mobile devices&lt;br /&gt;
*'''Networking''': consider [http://www.web3d.org/x3d/content/examples/Basic/Networking NetworkSensor] and event-passing issues, streaming using [http://www.json.org JSON], server-side 3D topics&lt;br /&gt;
*'''Security and privacy''':&lt;br /&gt;
** [http://www.w3.org/standards/xml/security XML Security] provides best-available encryption, digital signature (authentication)&lt;br /&gt;
** [http://www.w3.org/standards/webdesign/privacy Web Privacy]: examine X3D compatibility with Do Not Track, P3P, POWDER&lt;br /&gt;
** Review X3D specifications to ensure that Security Considerations are fully documented&lt;br /&gt;
*'''Viewing and navigation''': cinematic camera control, alternative navigation types (such as PAN, [http://www.x3dom.org/?p=3536 TURNTABLE] etc.), [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/behaviours.html Recommended navigation behaviours] review&lt;br /&gt;
&lt;br /&gt;
All suggestions and recommendations are welcome. Component improvements and additions are approved by Web3D Consortium members.&lt;br /&gt;
&lt;br /&gt;
Please [http://www.web3d.org/realtime-3d/contact contact us] if you think additional technologies need to be considered.&lt;br /&gt;
&lt;br /&gt;
== Backwards and forwards compatibility ==&lt;br /&gt;
&lt;br /&gt;
Thanks to careful design and insistence on implementation/evaluation, the X3D International Standard has maintained both steady growth and interoperability ever since Virtual Reality Modeling Language (VRML) in 1997. This track record of stability and innovation is among the best in the 3D graphics industry.&lt;br /&gt;
&lt;br /&gt;
[[X3D version 4.0 Development]] efforts are focused on HTML5/Declarative 3D/X3DOM and Augmented Reality Continuum (ARC) technologies, which may require architectural changes. Some new technologies may get pushed from 4.0 to 3.4 (or back again) after careful consideration by the respective working groups.&lt;br /&gt;
&lt;br /&gt;
*As with all other X3D components, all work defined in the abstract specification has corresponding file encodings (.x3d .x3dv .x3db) and language bindings (ECMAScript and Java). &lt;br /&gt;
*Compatibility concerns include evolutionary efforts to upgrade the X3D Compressed Binary Encoding (CBE), as described in the [http://www.web3d.org/realtime-3d/working-groups/x3d/compressed-binary/x3d-compressed-binary-encoding-call-contributions X3D Compressed Binary Encoding Call For Contributions].&lt;br /&gt;
*ECMAScript (JavaScript) support in X3D needs to be upgraded to the current standard for that rapidly improving programming language.&lt;br /&gt;
**[http://standards.iso.org/ittf/PubliclyAvailableStandards/c055755_ISO_IEC_16262_2011(E).zip ISO/IEC 16262:2011 Information technology — ECMAScript language specification] (.zip download)&lt;br /&gt;
**Downloadable from [http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html ISO Publicly Available Standards] site without charge&lt;br /&gt;
**This relates to [http://www.web3d.org/files/specifications/19777-1/V3.0/index.html 19777-1 Part 2, X3D Scene Access Interface (SAI) language bindings for EcmaScript]&lt;br /&gt;
&lt;br /&gt;
== Schedule ==&lt;br /&gt;
&lt;br /&gt;
*'''X3D CADInterchange Profile goal.''' Implementations are complete and tested. The [http://www.web3d.org/files/specifications/19775-1/V3.3/Part01/CADInterchange.html X3D CADInterchange Profile] was completed as part of X3D version 3.3 during 2013.&lt;br /&gt;
*'''Mobile Profile.''' Calling out a reduced palette for mobile devices remains a potential goal for 2014, but might instead become part of X3D version 4.0 efforts.&lt;br /&gt;
*'''X3D Compressed Binary Encoding (CBE) goal.''' This work is proceeding in parallel.&lt;br /&gt;
*'''X3D version 3.4 goal.''' Review progress during SIGGRAPH 2014, continue work in parallel with X3D version 4.0. Web3D Consortium members decide when a draft specification proceeds to ISO.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=Comparison_of_X3D_AR_Proposals&amp;diff=5125</id>
		<title>Comparison of X3D AR Proposals</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=Comparison_of_X3D_AR_Proposals&amp;diff=5125"/>
				<updated>2012-03-20T12:08:12Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Comparison of X3D AR Proposals - Working Draft =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
By [http://www.web3d.org/x3d/wiki/index.php/X3D_and_Augmented_Reality Augmented Reality Working Group], Web3D Consortium&lt;br /&gt;
&lt;br /&gt;
Mar 21, 2012&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1. Introduction ==&lt;br /&gt;
This document compares the existing proposals for extending X3D to support augmented and mixed reality visualization. Three main proposals are compared in terms of requirements – two from the Korea Chapter (KC1 and KC2) and one from Instant Reality (IR). Proposals KC1 and KC2 were put forward by Korea Chapter members Gun Lee and Gerard J. Kim, respectively, while proposal IR is from InstantReality, developed by Fraunhofer IGD. A summary of each proposal can be found at [[X3D and Augmented Reality#Existing Proposals]]. A third proposal from the Korea Chapter, by Woontack Woo, is not covered in this document since it is not directly related to extending the X3D specification.&lt;br /&gt;
The criteria used for comparing the proposals are based on the requirements described at [[X3D AR Requirements and Use cases]]. In the rest of this document, each section compares the proposals with respect to one requirement, summarizing how each proposal addresses it in the subsections and concluding with a discussion. After iterating through all of the requirements, we conclude with a summary and overall discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 2. Using Live Video stream as a texture ==&lt;br /&gt;
&lt;br /&gt;
=== 2.1 Proposal KC1 ===&lt;br /&gt;
This proposal introduces a new sensor node, CameraSensor (previously named LiveCamera), for retrieving live video data from a camera device and routing the video stream to a PixelTexture node. The X3D browser is in charge of implementing and handling devices and mapping the video data to the CameraSensor node inside the X3D scene. The video stream itself is provided as the value (SFImage) field of the node, which is updated every frame by the browser implementation according to the camera data.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 …&amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
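For illustration, the routing described above might be sketched as follows (a hypothetical fragment: the CameraSensor node follows the proposed signature, while Shape, Appearance, PixelTexture and ROUTE are standard X3D):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
        &amp;lt;PixelTexture DEF='camTex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Box/&amp;gt;&lt;br /&gt;
&amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='value' toNode='camTex' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;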
While this is straightforward, routing SFImage values might lead to performance and implementation problems. As an alternative, the same proposal also suggests extending the behavior of the existing MovieTexture node to support a live video stream within the node. The proposed behavior is for the X3D browser to allow users to select a file or a camera device for the MovieTexture node in the scene if the url field of the node is empty (or filled with special token values, such as ‘USER_CUSTOMIZED’).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;MovieTexture loop='true' url=''/&amp;gt; &lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While this approach avoids performance problems by not exposing SFImage fields updated in real time, it lacks support for using the live video stream for other purposes, such as the background. This is to be solved partially by adding a new node, MovieBackground, which behaves similarly to the MovieTexture but uses the user-selected movie file or live video stream from a camera to fill the background of the 3D scene.&lt;br /&gt;
&lt;br /&gt;
=== 2.2 Proposal KC2 ===&lt;br /&gt;
This proposal takes a similar approach to Proposal KC1 in that it explicitly defines a sensor node that represents a camera device. The video stream on the image field of the LiveCamera node is then routed to a texture node.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
LiveCamera {&lt;br /&gt;
	SFString		[in, out]		source		&amp;quot;default&amp;quot;&lt;br /&gt;
	SFImage	[out]		image&lt;br /&gt;
	SFMatrix4f	[out]		projmat		&amp;quot;1 0 0 …&amp;quot;&lt;br /&gt;
	SFBool		[out]		on		FALSE&lt;br /&gt;
	SFBool		[out]		tracking		FALSE&lt;br /&gt;
	SFVec3f		[out]		position&lt;br /&gt;
	SFRotation	[out]		orientation&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 2.3 Proposal IR ===&lt;br /&gt;
This proposal introduces a general-purpose IOSensor node, which allows external devices (e.g., joysticks and cameras) to be accessed inside the X3D scene. Note: for technical reasons, the input and output fields of a device can only be determined at runtime. Hence, most in-/out-slots are dynamically generated based on the device one wants to access.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor type='' name='' description='' enabled='TRUE' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The camera sensor (including marker/poster/... tracking) is loaded through an instance of IOSensor, by defining the type of the sensor and its fields as specified in the configFile (*.pm). Here is an example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TutorialMarkerTracking_OneMarker.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_PrincipalPoint' type='SFVec2f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_horizontal' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_vertical' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_CAM_aspect' type='SFFloat'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Using the camera image as a texture is then nothing more than routing the VideoSourceImage field of the IOSensor node to a PixelTexture node, which can also be part of a Background or Foreground appearance.&lt;br /&gt;
&lt;br /&gt;
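A hypothetical routing sketch for this, using the IOSensor instance defined above:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
        &amp;lt;PixelTexture DEF='camTex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Box/&amp;gt;&lt;br /&gt;
&amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='VideoSourceImage' toNode='camTex' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;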
&lt;br /&gt;
=== 2.4 Discussion ===&lt;br /&gt;
Proposals KC1 and KC2 each propose a new node specific to a camera, while proposal IR proposes a more generic type of node that can be applied to a variety of sensors. The trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 3. Using Live Video stream as a background ==&lt;br /&gt;
&lt;br /&gt;
=== 3.1 Proposal KC1 ===&lt;br /&gt;
The proposal introduces a MovieBackground node, extended from the Background node, with a ‘liveSource’ field to which a CameraSensor node (as described in 2.1) is assigned; from this node the background receives the live video stream. Once the ‘liveSource’ field is assigned a valid CameraSensor node, the background image is updated according to the live video stream from that CameraSensor. For other uses, the node also has a url field to which a general movie source can be assigned and used as a background.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MovieBackground:X3DBackgroundNode {&lt;br /&gt;
     ... // same to the original Background node&lt;br /&gt;
     SFString    [in] url&lt;br /&gt;
     SFNode 	[in] liveSource&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
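As a hypothetical usage sketch (in the XML encoding, where the SFNode liveSource field would be populated via containerField), the MovieBackground might reference a CameraSensor like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;MovieBackground&amp;gt;&lt;br /&gt;
    &amp;lt;CameraSensor USE='cam' containerField='liveSource'/&amp;gt;&lt;br /&gt;
&amp;lt;/MovieBackground&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;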
Similar to the case in 2.1, the proposal also suggests a different approach in which the MovieBackground node does not explicitly need a CameraSensor node; instead, the browser asks the user to choose the movie source (including a camera device) when the url field is left empty (or filled with special token values, such as ‘USER_CUSTOMIZED’).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 3.2 Proposal KC2 ===&lt;br /&gt;
This proposal extends the TextureBackground node to support a live video background. The video stream image is routed from the LiveCamera node to the frontTexture field. However, since the TextureBackground node acts as an environment map, there is a problem with the orientation of the TextureBackground, which is world-registered and not fixed to the viewpoint movement. To solve this problem, the proposal adds a Boolean field called ARmode. When the ARmode flag is true, the orientation of the TextureBackground is fixed to the viewpoint, so that the front-side texture remains as the background. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
TextureBackground : X3DBackgroundNode &lt;br /&gt;
{&lt;br /&gt;
	SFBool 	        [in] 	        set_bind &lt;br /&gt;
	SFBool 	        [in] 	        ARmode&lt;br /&gt;
	MFFloat 	[in,out] 	groundAngle 	[] 	[0,π/2] &lt;br /&gt;
	MFColor 	[in,out] 	groundColor 	[] 	[0,1] &lt;br /&gt;
	SFNode	        [in,out] 	backTexture 	NULL 	[X3DTextureNode] &lt;br /&gt;
	SFNode	        [in,out] 	bottomTexture 	NULL 	[X3DTextureNode] &lt;br /&gt;
	SFNode	        [in,out] 	frontTexture 	NULL 	[X3DTextureNode] &lt;br /&gt;
	SFNode	        [in,out] 	leftTexture 	NULL 	[X3DTextureNode] &lt;br /&gt;
	SFNode	        [in,out] 	metadata 	NULL 	[X3DMetadataObject] &lt;br /&gt;
	SFNode	        [in,out] 	rightTexture 	NULL 	[X3DTextureNode] &lt;br /&gt;
	SFNode	        [in,out] 	topTexture 	NULL 	[X3DTextureNode] &lt;br /&gt;
	MFFloat 	[in,out] 	skyAngle 	[] 	[0,π] &lt;br /&gt;
	MFColor 	[in,out] 	skyColor 	0 0 0 	[0,1] &lt;br /&gt;
	SFFloat	        [in,out] 	transparency 	0 	[0,1] &lt;br /&gt;
	SFTime	        [out] 	        bindTime &lt;br /&gt;
	SFBool 	        [out] 	        isBound &lt;br /&gt;
} &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
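A hypothetical scene fragment for this approach (assuming ARmode can be initialized as an attribute, and using a PixelTexture as the frontTexture target) might look like:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;LiveCamera DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;TextureBackground ARmode='true'&amp;gt;&lt;br /&gt;
    &amp;lt;PixelTexture DEF='bgTex' containerField='frontTexture'/&amp;gt;&lt;br /&gt;
&amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='image' toNode='bgTex' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;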
&lt;br /&gt;
=== 3.3 Proposal IR ===&lt;br /&gt;
This proposal handles this requirement similarly to the case of using the camera image as a texture. It proposes a PolygonBackground node, which represents a background that renders a single polygon using the specified appearance. It allows defining an aspect ratio of the background image that is independent of the actual window size. Different modes are available to fit the image in the window (vertical or horizontal).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;PolygonBackground positions='0 0, 1 0, 1 1, 0 1' texCoords='0 0 0, 1 0 0, 1 1 0, 0 1 0' normalizedX='TRUE' normalizedY='TRUE' fixedImageSize='0,0' zoomFactor='1.0' tile='TRUE' doCleanup='TRUE' mode='VERTICAL' clearStencilBitplanes='-1' description='' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Using the proposed PolygonBackground node, the camera image is simply routed to the texture in the node's appearance: the image assigned to the image out-slot of the IOSensor is routed to the texture in the appearance of the PolygonBackground node.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
        &amp;lt;PixelTexture DEF='tex' /&amp;gt;&lt;br /&gt;
        &amp;lt;TextureTransform scale='1 -1'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&amp;lt;/PolygonBackground&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='VideoSourceImage' toNode='tex' toField='image'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To make the polygon for the background fill the viewport, the PolygonBackground's fixedImageSize field is used to describe the aspect ratio of the image, and the mode field is set to &amp;quot;VERTICAL&amp;quot; or &amp;quot;HORIZONTAL&amp;quot;, which describes how the polygon fits the viewport.&lt;br /&gt;
&lt;br /&gt;
Alternatively, for simpler cases, the ImageBackground node, which has texCoords and texture fields, can be used instead.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 3.4 Discussion ===&lt;br /&gt;
Proposal KC1 proposes a dedicated node for movie backgrounds. Proposal KC2 extends the TextureBackground node and repurposes it as a fixed textured background. Proposal IR takes a more general approach and proposes a multi-purpose PolygonBackground node, which can contain any type of appearance, including shaders. While the latter gives more flexibility, it requires more details to be elaborated compared to the former, which is simpler. Again, the trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 4. Supporting color keying in texture ==&lt;br /&gt;
&lt;br /&gt;
=== 4.1 Proposal KC1 ===&lt;br /&gt;
This proposal adds a ‘keyColor’ field to the MovieTexture node, indicating the color expected to be rendered as transparent in order to provide a chroma-key effect on the movie texture. The browser is in charge of rendering the parts of the MovieTexture matching the key color as transparent, and browsers that do not support this feature can simply fall back to rendering the MovieTexture in the normal way (i.e. showing the texture as is).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MovieTexture:X3DTexture2DNode {&lt;br /&gt;
     ... // same to the MovieTexture node described in 2.1&lt;br /&gt;
SFColor    [in] keyColor&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
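For example, a green-screen overlay video might be declared as follows (a hypothetical sketch; the file name is illustrative, and keyColor follows the proposed field above):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;MovieTexture loop='true' url='&amp;quot;overlay.mpg&amp;quot;' keyColor='0 1 0'/&amp;gt;&lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;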
=== 4.2 Proposal KC2 ===&lt;br /&gt;
This proposal does not include this feature.&lt;br /&gt;
&lt;br /&gt;
=== 4.3 Proposal IR ===&lt;br /&gt;
This proposal does not include a direct solution for this case, since it is not directly related to AR applications. Closely related functionality in this proposal would be the ColorMaskMode, BlendMode, StencilMode and DepthMode nodes as children of the Appearance node.&lt;br /&gt;
&lt;br /&gt;
The ColorMaskMode masks specific color channels, which results in color changes across the whole image. Rather than making pixels of the key color appear transparent, the ColorMaskMode changes the color of every pixel.&lt;br /&gt;
The ColorMaskMode together with the Appearance node's sortKey field (the default sortKey is 0; objects with a smaller sortKey are rendered first, and those with a greater one are rendered last) can also be used to create invisible ghosting objects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;ColorMaskMode maskR='TRUE' maskG='TRUE' maskB='TRUE' maskA='TRUE' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The BlendMode gives general control over the alpha blending function. However, there is no function that compares the source image with a given key color, which would be necessary for a proper color-keying result.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;BlendMode srcFactor='src_alpha' destFactor='one_minus_src_alpha' color='1 1 1' colorTransparency='0' &lt;br /&gt;
 alphaFunc='none' alphaFuncValue='0' equation='none' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To achieve chroma keying for an arbitrary color, one can, for example, use a user-defined shader that discards all fragments whose color is equal to the given key color.&lt;br /&gt;
&lt;br /&gt;
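A minimal sketch of such a shader-based approach, using the standard X3D Shaders component (the threshold and file names are illustrative, and the fragment shader assumes a paired vertex shader that forwards the texture coordinate):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;ComposedShader language='GLSL'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='inputOutput' name='keyColor' type='SFColor' value='0 1 0'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='inputOutput' name='tex' type='SFInt32' value='0'/&amp;gt;&lt;br /&gt;
    &amp;lt;ShaderPart type='FRAGMENT' url='&amp;quot;chromakey.fs&amp;quot;'/&amp;gt;&lt;br /&gt;
&amp;lt;/ComposedShader&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
where chromakey.fs discards fragments close to the key color:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
uniform vec3 keyColor;&lt;br /&gt;
uniform sampler2D tex;&lt;br /&gt;
void main() {&lt;br /&gt;
    vec4 c = texture2D(tex, gl_TexCoord[0].st);&lt;br /&gt;
    if (distance(c.rgb, keyColor) &amp;lt; 0.1)&lt;br /&gt;
        discard;&lt;br /&gt;
    gl_FragColor = c;&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;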
&lt;br /&gt;
=== 4.4 Discussion ===&lt;br /&gt;
Proposal KC1 suggests a simpler way to provide a specific color-keying function for textures, while proposal IR suggests more generic functions that can achieve the required effect. Although the corresponding nodes in proposal IR miss certain features needed to fulfill color keying out of the box, this can be achieved via shaders. Again, the trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 5. Retrieving tracking information ==&lt;br /&gt;
&lt;br /&gt;
=== 5.1 Proposal KC1 ===&lt;br /&gt;
This proposal suggests using the same CameraSensor node that is used for retrieving the live video stream also for retrieving tracking information. As described in 2.1, the proposed CameraSensor node includes ‘position’ and ‘orientation’ fields that represent the tracked camera motion. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 …&amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The method is limited in that it does not support tracking information for general objects other than the camera itself. &lt;br /&gt;
&lt;br /&gt;
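For example, the tracked camera pose might be routed to the bound Viewpoint so that the virtual camera follows the physical one (a hypothetical sketch based on the proposed CameraSensor fields):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint DEF='arView'/&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='position' toNode='arView' toField='position'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='orientation' toNode='arView' toField='orientation'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;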
=== 5.2 Proposal KC2 ===&lt;br /&gt;
This proposal introduces a new node named &amp;quot;ImagePatch&amp;quot;, which provides tracking information for a visual marker. In comparison with Proposal KC1, this is a node separate from the one that represents the camera sensor, which allows multiple visual markers to be used for tracking.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
ImagePatch : X3DARNode&lt;br /&gt;
{&lt;br /&gt;
	MFString	[in, out]		filename&lt;br /&gt;
	SFVec3f	        [in, out]		position&lt;br /&gt;
	SFRotation	[in, out]		orientation&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
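A hypothetical usage sketch, anchoring virtual content to a tracked marker by routing the ImagePatch pose to a Transform (the marker file name is illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;ImagePatch DEF='marker' filename='&amp;quot;marker1.png&amp;quot;'/&amp;gt;&lt;br /&gt;
&amp;lt;Transform DEF='anchor'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&amp;lt;Box/&amp;gt;&amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;/Transform&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='marker' fromField='position' toNode='anchor' toField='translation'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='marker' fromField='orientation' toNode='anchor' toField='rotation'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;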
This proposal also defines nodes for retrieving tracking information from sensors other than vision-based tracking. For instance, the GPSLocation node provides tracking information from a GPS sensor.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GPSLocation : X3DSensorNode&lt;br /&gt;
{&lt;br /&gt;
	SFBool		[in, out]		status&lt;br /&gt;
	MFString	[in, out]		values&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 5.3 Proposal IR ===&lt;br /&gt;
For retrieving tracking information, this proposal uses the same IOSensor node as used for retrieving the camera image. In this example, the TrackedObject1Camera_ModelView field of the IOSensor node represents the transformation matrix holding the tracked position/orientation of the tracked object (visual marker). Note that these are all dynamic fields whose names depend on the configuration defined in the .pm file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TutorialMarkerTracking_OneMarker.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_PrincipalPoint' type='SFVec2f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_horizontal' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_vertical' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_CAM_aspect' type='SFFloat'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The node could support multiple tracked objects by changing the configFile (the TutorialMarkerTracking_OneMarker.pm file in the sample code) and defining additional ModelView, Projection, etc. fields for the tracked objects and/or the camera pose.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 5.4 Discussion ===&lt;br /&gt;
While both proposals KC1 and IR propose to retrieve tracking information from a node that represents a camera sensor, proposal KC1 gives the tracking information of the camera itself, whereas proposal IR deals with the tracking information of the tracked object. This makes proposal IR more extensible in terms of supporting multiple tracked objects. However, the method of defining tracked objects and markers through a proprietary configuration file needs to be revised for standardization. On the other hand, proposal KC2 proposes a dedicated tracking node, separate from the camera sensor node. As a result, multiple tracked objects are easily supported by creating multiple instances of this tracking node. Proposal KC2 also provides a GPS tracking node besides computer-vision-based tracking. GPS-based tracking should be investigated and compared to another proposal by Myeongwon Lee, which was originally discussed at the X3D Earth working group [http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 6. Using tracking information to change 3D scene ==&lt;br /&gt;
&lt;br /&gt;
=== 6.1 Proposal KC1 ===&lt;br /&gt;
This proposal proposes, in general, to use routing to link tracking information from the CameraSensor node to a Viewpoint node’s position and orientation. This could also be extended by a MatrixViewpoint node (described in 8.1), which has a field to identify the corresponding CameraSensor node, achieving the same result without explicitly routing the corresponding fields.&lt;br /&gt;
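&lt;br /&gt;
A minimal sketch of this routing (the DEF names are hypothetical; the position and orientation fields are taken from the CameraSensor signature in 5.1):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam' /&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint DEF='vp' /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='position' toNode='vp' toField='position'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='orientation' toNode='vp' toField='orientation'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;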
&lt;br /&gt;
&lt;br /&gt;
=== 6.2 Proposal KC2 ===&lt;br /&gt;
This proposal also uses routing to apply tracking information to the 3D scene, routing tracking results (position and orientation) to Transform nodes.&lt;br /&gt;
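&lt;br /&gt;
The routing described above could be sketched as follows (the ImagePatch fields are those proposed in 5.2; the DEF names, the filename value, and the mapping onto the Transform node's translation and rotation fields are assumptions):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;ImagePatch DEF='marker' filename='&amp;quot;marker.png&amp;quot;' /&amp;gt;&lt;br /&gt;
&amp;lt;Transform DEF='augmentation'&amp;gt; ... &amp;lt;/Transform&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='marker' fromField='position' toNode='augmentation' toField='translation'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='marker' fromField='orientation' toNode='augmentation' toField='rotation'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;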
&lt;br /&gt;
Besides this basic method for using raw tracking information, this proposal also proposes higher-level event nodes, such as VisibilitySensor and RangeSensor. The VisibilitySensor node triggers events when the tracked visual marker is detected or lost, while the RangeSensor node triggers events when a tracked object comes within a certain range.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
VisibilitySensor : X3DEnvironmentalSensorNode &lt;br /&gt;
{&lt;br /&gt;
	SFBool		[in, out]		enabled&lt;br /&gt;
	SFTime		[out]		enterTime &lt;br /&gt;
	SFTime		[out]		exitTime &lt;br /&gt;
	SFBool		[out]		isActive &lt;br /&gt;
} &lt;br /&gt;
&lt;br /&gt;
RangeSensor : X3DEnvironmentalSensorNode&lt;br /&gt;
{&lt;br /&gt;
	SFBool		[in, out]		enabled&lt;br /&gt;
	SFTime		[out]		enterTime &lt;br /&gt;
	SFTime		[out]		exitTime &lt;br /&gt;
	SFBool		[out]		isActive &lt;br /&gt;
	SFInt32		[in, out]		sequence&lt;br /&gt;
	SFString		[in, out]		lBound &lt;br /&gt;
	SFString		[in, out]		uBound &lt;br /&gt;
} &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 6.3 Proposal IR ===&lt;br /&gt;
This proposal proposes to use the routing method to link tracking information from the IOSensor node to a Transform node of a corresponding virtual object or viewpoint. Example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;MatrixTransform DEF='TransformRelativeToCam'&amp;gt; &lt;br /&gt;
    &amp;lt;Shape&amp;gt; &lt;br /&gt;
        &amp;lt;Appearance&amp;gt; &lt;br /&gt;
            &amp;lt;Material diffuseColor='1 0.5 0' /&amp;gt; &lt;br /&gt;
        &amp;lt;/Appearance&amp;gt; &lt;br /&gt;
        &amp;lt;Teapot size='5 5 5' /&amp;gt; &lt;br /&gt;
    &amp;lt;/Shape&amp;gt; &lt;br /&gt;
&amp;lt;/MatrixTransform&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_ModelView' toNode='TransformRelativeToCam' toField='set_matrix'/&amp;gt; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For routing a transformation matrix to a transform node, this proposal also proposes a MatrixTransform node that takes a transformation matrix directly, rather than separate position and orientation fields. The render field controls visibility.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MatrixTransform : X3DGroupingNode {&lt;br /&gt;
 ...&lt;br /&gt;
 SFBool     [in,out] render TRUE&lt;br /&gt;
 SFMatrix4f [in,out] matrix identity&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
It is of course also possible to route the tracked camera pose (also in orientation/position notation) to the bound Viewpoint node.&lt;br /&gt;
There are different field-of-view modes: vertical, horizontal, and smaller. The field-of-view and principal point delivered by the IOSensor can be routed to the viewpoint; example below.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint principalPoint='0 0' fieldOfView='0.785398' fovMode='SMALLER' aspect='1.0' retainUserOffsets='FALSE' &lt;br /&gt;
 zFar='-1' jump='TRUE' zNear='-1' description='' position='0 0 10' orientation='0 0 1 0' centerOfRotation='0 0 0' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint DEF='vp' position='0 0 0' fovMode='VERTICAL' /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_PrincipalPoint' toNode='vp' toField='principalPoint'/&amp;gt; &lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_FOV_vertical' toNode='vp' toField='fieldOfView'/&amp;gt; &lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_CAM_aspect' toNode='vp' toField='aspect'/&amp;gt; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 6.4 Discussion ===&lt;br /&gt;
While all of the proposals rely on routing to apply tracking results to the 3D scene, as discussed in 5.4, proposal KC1 focuses on updating the Viewpoint node, whereas proposals KC2 and IR allow updating both the camera and a virtual object (or scene). Proposal IR also proposes a new type of transformation node for dealing with transformation matrices, while proposal KC1 sticks to traditional position and orientation vectors.&lt;br /&gt;
In addition, proposal KC2 proposes higher-level event-generation nodes that trigger tracking-based events such as proximity and visibility.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 7. Retrieving camera calibration (internal parameters) information ==&lt;br /&gt;
&lt;br /&gt;
=== 7.1 Proposal KC1 ===&lt;br /&gt;
This proposal suggests using the same CameraSensor node, used for retrieving the live video stream, also for retrieving camera calibration information. As described in 2.1, the proposed CameraSensor node includes a ‘projmat’ field which represents the calibration information of the CameraSensor.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 … &amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 7.2 Proposal KC2 ===&lt;br /&gt;
This proposal takes a similar approach to proposal KC1, providing a field that represents camera calibration information in the node for the live video camera.&lt;br /&gt;
&lt;br /&gt;
=== 7.3 Proposal IR ===&lt;br /&gt;
This proposal suggests using the same IOSensor node that is used for retrieving images from the camera sensor. Several fields in this node (in this example called, e.g., TrackedObject1Camera_PrincipalPoint, TrackedObject1Camera_FOV_horizontal, TrackedObject1Camera_FOV_vertical and TrackedObject1Camera_CAM_aspect) provide the calibration information. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TutorialMarkerTracking_OneMarker.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_PrincipalPoint' type='SFVec2f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_horizontal' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_vertical' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_CAM_aspect' type='SFFloat'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 7.4 Discussion ===&lt;br /&gt;
All three proposals suggest reusing the node that is used for accessing a camera sensor, with a dedicated field of that node providing the camera calibration information. While proposals KC1 and KC2 suggest using a projection matrix as the calibration information, proposal IR suggests using a set of parameters. The latter approach could be safer, since it encapsulates the viewpoint's projection matrix, which can be implementation dependent based on the graphics API being used.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 8. Using calibration information to set properties of (virtual) camera ==&lt;br /&gt;
&lt;br /&gt;
=== 8.1 Proposal KC1 ===&lt;br /&gt;
This proposal suggests a MatrixViewpoint node, a child of the scene that represents a virtual viewpoint calibrated according to the corresponding physical live video camera (on the user's computer). The 'projmat' field represents the internal parameters (or projection matrix) of the MatrixViewpoint. The 'position' and 'orientation' fields represent the three-dimensional position and orientation of the viewpoint within the virtual space. The 'cameraSensor' field references a CameraSensor node from which the viewpoint parameters (including the projmat, position and orientation fields) are updated. Once the 'cameraSensor' field is assigned a valid CameraSensor node, the viewpoint parameters are updated according to the values from that node. Alternatively, each parameter of the MatrixViewpoint node can be routed from a corresponding source of calibrated values.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MatrixViewpoint : X3DViewpointNode{&lt;br /&gt;
     SFMatrix4f 		[in,out]	projmat&lt;br /&gt;
     SFVec3f 		[in,out]	position&lt;br /&gt;
     SFRotation 		[in,out]	orientation&lt;br /&gt;
     SFNode 		[in,out]	cameraSensor&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 8.2 Proposal KC2 ===&lt;br /&gt;
This proposal suggests a similar approach to proposal KC1, using a viewpoint node that accepts camera calibration information in matrix form.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 8.3 Proposal IR ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Viewpoint : X3DViewpointNode {&lt;br /&gt;
  ...&lt;br /&gt;
  SFString [in,out] fovMode        VERTICAL&lt;br /&gt;
  SFVec2f  [in,out] principalPoint 0 0&lt;br /&gt;
  SFFloat  [in,out] aspect         1.0&lt;br /&gt;
  SFFloat  [in,out] zNear          -1&lt;br /&gt;
  SFFloat  [in,out] zFar           -1&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The new fields provide a more general camera model than the standard Viewpoint. The &amp;quot;principalPoint&amp;quot; field defines the relative position of the principal point. If the principal point is not equal to zero, the viewing frustum parameters (left, right, top, bottom) are simply shifted in the camera's image plane. A value of x = 2 means the left value is equal to the default right value. A value of x = -2 means the right value is equal to default. If the principal point is not equal to zero, the &amp;quot;fieldOfView&amp;quot; value is not equal to the real field of view of the camera, otherwise it complies with the default settings. &lt;br /&gt;
&lt;br /&gt;
To extend this idea, the &amp;quot;fovMode&amp;quot; field defines whether the field of view is measured vertically, horizontally, or in the smaller direction, which is important for correctly parameterizing a cinematographic camera.&lt;br /&gt;
The field &amp;quot;aspect&amp;quot; defines the aspect ratio for the viewing angle defined by the &amp;quot;fieldOfView&amp;quot; range. This setting is independent of the current aspect ratio of the window, but reflects the aspect ratio of the actual capturing device. This extension allows us to model cameras with a non-square pixel format, i.e. it defines the (width / height) of a pixel.&lt;br /&gt;
&lt;br /&gt;
In addition to the Viewpoint extension we include a new camera node named Viewfrustum. This node has the two input/output fields &amp;quot;modelview&amp;quot; and &amp;quot;projection&amp;quot; of type SFMatrix4f. With the Viewfrustum node we are able to define a camera position and projection utilizing a standard projection/modelview matrix pair.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Viewfrustum : X3DViewpointNode {&lt;br /&gt;
  ...&lt;br /&gt;
  SFMatrix4f [in,out] modelview  (identity)&lt;br /&gt;
  SFMatrix4f [in,out] projection (identity)&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
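&lt;br /&gt;
A sketch of driving such a node from the tracker (the IOSensor field name is taken from the example in 5.3; the DEF name 'vf' is hypothetical):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewfrustum DEF='vf' /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='TrackedObject1Camera_ModelView' toNode='vf' toField='modelview'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;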
&lt;br /&gt;
&lt;br /&gt;
=== 8.4 Discussion ===&lt;br /&gt;
All of the proposals introduce a new type of Viewpoint node to support the camera calibration information described in section 7. While they use a different type and number of fields for representing the calibration information, they all use the same routing method to apply these values to a Viewpoint node. As discussed in 7.4, assigning a projection matrix directly to a viewpoint may result in defects, such as incorrect projections or near/far clipping planes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 9. Specifying nodes as physical object representatives ==&lt;br /&gt;
&lt;br /&gt;
=== 9.1 Proposal KC1 ===&lt;br /&gt;
This proposal suggests a GhostGroup node that indicates its child nodes are representatives of physical objects, for visualizing correct occlusion. The proposed node extends the Group node so that the geometries of its child nodes are rendered as ghost objects: the browser should render the child nodes only into the depth buffer and not into the color buffer. As a result, the portion of the live video image corresponding to the ghost object is visualized with correct depth values, forming correct occlusion with other virtual objects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GhostGroup: X3DGroupingNode{&lt;br /&gt;
     ... // same to the original Group node&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
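&lt;br /&gt;
A hypothetical usage sketch: a proxy box standing in for a real-world object is placed under a GhostGroup, so that it occludes virtual objects behind it (the geometry and its size are illustrative only):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;GhostGroup&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
        &amp;lt;Box size='1.2 0.7 0.8'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;/GhostGroup&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;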
&lt;br /&gt;
&lt;br /&gt;
=== 9.2 Proposal KC2 ===&lt;br /&gt;
This proposal does not include this feature.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 9.3 Proposal IR ===&lt;br /&gt;
This proposal proposes using a ColorMaskMode node to render the geometry only into the depth buffer and not into the color buffer. In addition, a new field &amp;quot;sortKey&amp;quot; is proposed for the Appearance node to make sure the ghost objects are rendered before other geometries.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Shape&amp;gt;&lt;br /&gt;
   &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
     &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
   &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
   ...&lt;br /&gt;
&amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 9.4 Discussion ===&lt;br /&gt;
While proposal KC1 suggests a high-level, simple-to-use approach for a specific application in AR/MR depth occlusion visualization, proposal IR suggests general-purpose, detailed control of the rendering process. Proposal KC1 directly deals with depth buffer values, providing a general-case solution for the depth occlusion problem. In comparison, proposal IR uses a color masking technique to mimic the depth occlusion effect, which could have limitations, with incorrect results in dynamic scenes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 10. Conclusion ==&lt;br /&gt;
Table 1 summarizes the differences between the proposals, showing which modifications are proposed in each proposal (column) for each functional requirement (row).&lt;br /&gt;
&lt;br /&gt;
{| border='1' &lt;br /&gt;
|+ Table 1. Comparison of X3D AR proposals ('''Bold''': newly proposed nodes, ''Italic'': modification to standard nodes)&lt;br /&gt;
!  !! width=&amp;quot;27%&amp;quot;|Proposal KC1 !! width=&amp;quot;27%&amp;quot;|Proposal KC2 !! width=&amp;quot;27%&amp;quot;|Proposal IR&lt;br /&gt;
|-&lt;br /&gt;
| Using Live Video stream as a texture&lt;br /&gt;
| ''MovieTexture'' node (or optionally with routing from '''CameraSensor''' node)&lt;br /&gt;
| '''LiveCamera''' node, routing to a PixelTexture node&lt;br /&gt;
| '''IOSensor''' node, routing to a PixelTexture node&lt;br /&gt;
|-&lt;br /&gt;
| Using Live Video stream as a background&lt;br /&gt;
| '''MovieBackground''' node (or optionally with routing from '''CameraSensor''' node)&lt;br /&gt;
| '''LiveCamera''' node + ''TextureBackground'' node&lt;br /&gt;
| '''IOSensor''' node + '''PolygonBackground''' node (or optionally '''ImageBackground''' node)&lt;br /&gt;
|-&lt;br /&gt;
| Supporting color keying in texture&lt;br /&gt;
| ''MovieTexture'' node&lt;br /&gt;
| N/A&lt;br /&gt;
| N/A (use general shader support)&lt;br /&gt;
|-&lt;br /&gt;
| Retrieving tracking information&lt;br /&gt;
| '''CameraSensor''' node&lt;br /&gt;
| '''ImagePatch''' and '''GPSLocation''' nodes&lt;br /&gt;
| '''IOSensor''' node&lt;br /&gt;
|-&lt;br /&gt;
| Using tracking information to change 3D scene&lt;br /&gt;
| routing from '''CameraSensor''' node&lt;br /&gt;
| '''VisibilitySensor''' and '''RangeSensor''' nodes&lt;br /&gt;
| routing from '''IOSensor''' node&lt;br /&gt;
|-&lt;br /&gt;
| Retrieving camera calibration (internal parameters) information&lt;br /&gt;
| '''CameraSensor''' node&lt;br /&gt;
| '''LiveCamera''' node&lt;br /&gt;
| '''IOSensor''' node&lt;br /&gt;
|-&lt;br /&gt;
| Using calibration information to set properties of (virtual) camera&lt;br /&gt;
| '''MatrixViewpoint''' node&lt;br /&gt;
| ''Viewpoint'' node&lt;br /&gt;
| '''Viewfrustum''' and ''Viewpoint'' nodes (alternatively '''MatrixTransform''' node)&lt;br /&gt;
|-&lt;br /&gt;
| Specifying nodes as physical object representatives&lt;br /&gt;
| '''GhostGroup''' node&lt;br /&gt;
| N/A&lt;br /&gt;
| '''ColorMaskMode''' and ''Appearance'' nodes (together with sortKey field)&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
While all of the proposals cover a similar set of functionalities required for supporting AR and MR visualization in X3D, proposals KC1 and KC2 take the path of relatively higher-level control, providing simpler syntax that can be applied to specific AR and MR cases. In contrast, proposal IR introduces more general-purpose nodes and suggests combining these nodes to implement the required AR functions, treating AR and MR visualization as a special use case of the proposed extension. Considering the differences between the proposals, the trade-off between simplicity and flexibility/extensibility needs further discussion as the AR WG proceeds to develop specifications for AR visualization components.&lt;br /&gt;
&lt;br /&gt;
From the content authors' point of view, higher-level abstracted control gives simpler and easier-to-use syntax. However, the detailed control that might be necessary for applications other than common AR/MR visualization could be missing.&lt;br /&gt;
&lt;br /&gt;
From the browser implementors' point of view, encapsulating the functions into higher-level components gives more room to choose their own way of implementing a given function. However, if more detailed control is required and added later for other applications, this could affect how the earlier higher-level components are implemented and may require implementation-level changes. Testing each function would also be more complicated if low-level details are accessible to scene authors, since there are more cases to test to make sure each combination of low-level components works together in the general case.&lt;br /&gt;
&lt;br /&gt;
Providing both options could be an alternative, giving content authors multiple choices. However, this would place more burden on browser implementors, and the specification development would take more effort, especially considering that AR and tracking methods are still a moving target, far from standardization.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=Comparison_of_X3D_AR_Proposals&amp;diff=4982</id>
		<title>Comparison of X3D AR Proposals</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=Comparison_of_X3D_AR_Proposals&amp;diff=4982"/>
				<updated>2012-02-07T11:26:29Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Comparison between existing proposals - Working Draft =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Augmented Reality Working Group&lt;br /&gt;
Web3D Consortium&lt;br /&gt;
&lt;br /&gt;
Jan 25, 2012&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1. Introduction ==&lt;br /&gt;
&lt;br /&gt;
This document compares the existing proposals for extending X3D to support augmented and mixed reality visualization. Three main proposals are compared in terms of requirements: two from the Korea Chapter (A, B) and one from Instant Reality (C).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 2. Using Live Video stream as a texture ==&lt;br /&gt;
&lt;br /&gt;
=== 2.1 Proposal A ===&lt;br /&gt;
This proposal proposed a new sensor node, CameraSensor (previously named LiveCamera), for retrieving live video data from a camera device and then routing the video stream to a PixelTexture node. The X3D browser is in charge of implementing and handling the devices and mapping the video data to the CameraSensor node inside the X3D scene. The video stream itself is provided as the value (SFImage) field of the node, which is updated every frame by the browser implementation according to the camera data.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 … &amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
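&lt;br /&gt;
A sketch of the routing to a PixelTexture (the DEF names are hypothetical; the value field is the SFImage output described above):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam' /&amp;gt;&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;PixelTexture DEF='camTex' /&amp;gt;&lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='value' toNode='camTex' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;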
&lt;br /&gt;
While this is straightforward, routing SFImage values might lead to performance and implementation problems. As an alternative, the same proposal also proposed extending the behavior of the existing MovieTexture node to support a live video stream within the node. The proposed behavior is for the X3D browser to allow users to select a file or a camera device for the MovieTexture node in the scene if the url field of the node is empty (or filled with special token values, such as ‘USER_CUSTOMIZED’).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;MovieTexture loop='true' url=''/&amp;gt; &lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While this approach avoids performance problems by not exposing SFImage fields updated in real time, it lacks support for using the live video stream data for other purposes, such as a background. This is partially solved by adding a new MovieBackground node, which behaves similarly to the MovieTexture but uses the user-selected movie file or live video stream from a camera to fill the background of the 3D scene.&lt;br /&gt;
&lt;br /&gt;
=== 2.2 Proposal B ===&lt;br /&gt;
The proposal from Gerard Kim, in the Korea Chapter, proposed a new sensor node, …&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 2.3 Proposal C ===&lt;br /&gt;
This proposal proposes a general-purpose IOSensor node, which allows access to external devices (e.g., joysticks and cameras) inside the X3D scene. Note: for technical reasons, the input and output fields of a device can only be determined at runtime. Hence, most in-/out-slots are dynamically generated based on the device one wants to access.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor type='' name='' description='' enabled='TRUE' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The camera sensor (including marker/poster/... tracking) is loaded through an instance of IOSensor, by defining the type of the sensor and its fields as specified in the configFile (*.pm). Here is an example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TutorialMarkerTracking_OneMarker.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_PrincipalPoint' type='SFVec2f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_horizontal' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_vertical' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_CAM_aspect' type='SFFloat'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Using the camera image as a texture is nothing more than routing the VideoSourceImage field of the IOSensor node to a PixelTexture node, which can also be part of a Background or Foreground appearance.&lt;br /&gt;
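&lt;br /&gt;
Sketched with the IOSensor instance defined above (the PixelTexture DEF name is hypothetical):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;PixelTexture DEF='tex' /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='VideoSourceImage' toNode='tex' toField='image'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;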
&lt;br /&gt;
&lt;br /&gt;
=== 2.4 Discussion ===&lt;br /&gt;
Proposals A and B propose a new node specific to a camera, while C proposes a more generic type of node applicable to a variety of sensors. The trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 3. Using Live Video stream as a background ==&lt;br /&gt;
&lt;br /&gt;
=== 3.1 Proposal A ===&lt;br /&gt;
The proposal proposed a MovieBackground node, extended from the Background node with a ‘liveSource’ field that is assigned a CameraSensor node (as described in 2.1) from which the background receives the live video stream data. Once the ‘liveSource’ field is assigned a valid CameraSensor node, the background image is updated according to the live video stream from that node. For other uses, it also has a url field to which a general movie clip source can be assigned and used as the background.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MovieBackground:X3DBackgroundNode {&lt;br /&gt;
     ... // same to the original Background node&lt;br /&gt;
     SFString    [in] url&lt;br /&gt;
     SFNode 	[in] liveSource&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
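&lt;br /&gt;
A hypothetical instantiation in the XML encoding, assuming the liveSource field is filled with a DEF/USE reference to a CameraSensor (all names illustrative):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam' /&amp;gt;&lt;br /&gt;
&amp;lt;MovieBackground&amp;gt;&lt;br /&gt;
    &amp;lt;CameraSensor USE='cam' containerField='liveSource'/&amp;gt;&lt;br /&gt;
&amp;lt;/MovieBackground&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;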
&lt;br /&gt;
Similar to the case in 2.1, the proposal also suggests a different approach in which the MovieBackground node doesn’t explicitly need a CameraSensor node, but instead lets the browser ask the user to choose the movie source (including a camera device) when the url field is left empty (or filled with special token values, such as ‘USER_CUSTOMIZED’).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 3.3 Proposal C ===&lt;br /&gt;
This proposal deals with the problem similar to the case for using the camera image for texture. It proposes a PolygonBackground node, which represents a background that renders a single polygon using the specified appearance. It allows for defining an aspect ratio of the background image that is independent of the actual window size. Different modes are possible to fit the image in the window (vertical or horizontal).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;PolygonBackground positions='0 0, 1 0, 1 1, 0 1' texCoords='0 0 0, 1 0 0, 1 1 0, 0 1 0' normalizedX='TRUE' normalizedY='TRUE' fixedImageSize='0,0' zoomFactor='1.0' tile='TRUE' doCleanup='TRUE' mode='VERTICAL' clearStencilBitplanes='-1' description='' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Using the proposed PolygonBackground node, the camera image is simply routed to the texture used for the background: the image assigned to the image outslot of the IOSensor is routed to the texture in the appearance of the PolygonBackground node.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
        &amp;lt;PixelTexture DEF='tex' /&amp;gt;&lt;br /&gt;
        &amp;lt;TextureTransform scale='1 -1'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&amp;lt;/PolygonBackground&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='VideoSourceImage' toNode='tex' toField='image'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To make the background polygon fill the viewport, the PolygonBackground's fixedImageSize field is used to describe the aspect ratio of the image, and the mode field is set to &amp;quot;VERTICAL&amp;quot; or &amp;quot;HORIZONTAL&amp;quot;, which describes how the polygon fits the viewport.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 3.4 Discussion ===&lt;br /&gt;
Proposal A proposes a dedicated node for movie backgrounds, while proposal C proposes a multi-purpose PolygonBackground node, which can contain any type of appearance, including shaders. The latter gives more flexibility but requires more details to be elaborated, whereas the former is simpler. Again, the trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 4. Supporting color keying in texture ==&lt;br /&gt;
&lt;br /&gt;
=== 4.1 Proposal A ===&lt;br /&gt;
This proposal adds a ‘keyColor’ field to the MovieTexture node, which indicates the color to be rendered as transparent, in order to provide a chroma-key effect on the movie texture. The browser is in charge of rendering the matching parts of the MovieTexture as transparent, and browsers that do not support this feature can simply fall back to rendering the MovieTexture in the normal way (i.e. showing the texture as is).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MovieTexture:X3DTexture2DNode {&lt;br /&gt;
     ... // same as the MovieTexture node described in 2.1&lt;br /&gt;
SFColor    [in] keyColor&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
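&lt;br /&gt;
For example, pure green pixels of a movie could be rendered transparent as follows (a hypothetical usage sketch; the movie URL is illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;MovieTexture loop='true' url='&amp;quot;overlay.mpg&amp;quot;' keyColor='0 1 0'/&amp;gt;&lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;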
&lt;br /&gt;
&lt;br /&gt;
=== 4.3 Proposal C ===&lt;br /&gt;
This proposal does not include a direct solution for this case, since it is not directly related to AR applications. The most closely related functions in this proposal are the ColorMaskMode, BlendMode, StencilMode and DepthMode nodes, used as children of the Appearance node.&lt;br /&gt;
&lt;br /&gt;
The ColorMaskMode masks specific color channels, which changes the colors of the whole image. Rather than making pixels of a key color transparent, the ColorMaskMode changes the color of every pixel.&lt;br /&gt;
The ColorMaskMode, together with the Appearance node's sortKey field (the default sortKey is 0; appearances with a smaller sortKey are rendered first, those with a greater one last), can also be used to create invisible ghosting objects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;ColorMaskMode maskR='TRUE' maskG='TRUE' maskB='TRUE' maskA='TRUE' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The BlendMode gives general control over the alpha blending function. However, there is no function that compares the source image with a given key color, which would be necessary for proper color keying.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;BlendMode srcFactor='src_alpha' destFactor='one_minus_src_alpha' color='1 1 1' colorTransparency='0' &lt;br /&gt;
 alphaFunc='none' alphaFuncValue='0' equation='none' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To achieve chroma keying for an arbitrary color, one can, for example, use a user-defined shader that discards all fragments whose color is equal to the given one.&lt;br /&gt;
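&lt;br /&gt;
As a sketch of this shader-based workaround (the shader file name, sampler and uniform names are illustrative, not part of any proposal), a ComposedShader in the Appearance could discard matching fragments:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;PixelTexture DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;ComposedShader&amp;gt;&lt;br /&gt;
        &amp;lt;field accessType='inputOutput' name='keyColor' type='SFColor' value='0 1 0'/&amp;gt;&lt;br /&gt;
        &amp;lt;field accessType='inputOutput' name='tex' type='SFInt32' value='0'/&amp;gt;&lt;br /&gt;
        &amp;lt;ShaderPart type='FRAGMENT' url='&amp;quot;chromakey.fs&amp;quot;'/&amp;gt;&lt;br /&gt;
    &amp;lt;/ComposedShader&amp;gt;&lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&lt;br /&gt;
// chromakey.fs (GLSL): discard fragments close to the key color&lt;br /&gt;
uniform sampler2D tex;&lt;br /&gt;
uniform vec3 keyColor;&lt;br /&gt;
void main() {&lt;br /&gt;
    vec4 c = texture2D(tex, gl_TexCoord[0].st);&lt;br /&gt;
    if (distance(c.rgb, keyColor) &amp;lt; 0.01) discard;&lt;br /&gt;
    gl_FragColor = c;&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;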
&lt;br /&gt;
&lt;br /&gt;
=== 4.4 Discussion ===&lt;br /&gt;
Proposal A suggests a simpler way to provide a specific color-keying function for textures, while C suggests more generic functions that can achieve the required behavior. Although the corresponding nodes in proposal C miss certain features needed to fulfill color keying out of the box, this can be achieved via shaders. Again, the trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 5. Retrieving tracking information ==&lt;br /&gt;
&lt;br /&gt;
=== 5.1 Proposal A ===&lt;br /&gt;
This proposal suggests using the same CameraSensor node, used for retrieving the live video stream, also for retrieving tracking information. As described in 2.1, the proposed CameraSensor node includes ‘position’ and ‘orientation’ fields that represent the tracked camera motion. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 …&amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A limitation of this method is that it does not support tracking information for objects other than the camera itself. &lt;br /&gt;
&lt;br /&gt;
=== 5.2 Proposal B ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 5.3 Proposal C ===&lt;br /&gt;
For retrieving tracking information, this proposal uses the same IOSensor node as is used for retrieving the camera image. In this example, the TrackedObject1Camera_ModelView field of the IOSensor node represents the transformation matrix of the tracked position/orientation of the tracked object (a visual marker). However, these are all dynamic fields and depend on the configuration defined in the pm file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TutorialMarkerTracking_OneMarker.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_PrincipalPoint' type='SFVec2f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_horizontal' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_vertical' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_CAM_aspect' type='SFFloat'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The node can support multiple tracked objects by changing the configFile (the TutorialMarkerTracking_OneMarker.pm file in the sample code) and defining additional ModelView, Projection, etc. fields for the tracked objects and/or the camera pose.&lt;br /&gt;
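&lt;br /&gt;
For instance, a configuration tracking two markers could expose additional fields like the following (the configFile name and the field names follow the naming pattern of the example above, but actually depend on the pm file, so they are illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TwoMarkers.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject2Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;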
&lt;br /&gt;
&lt;br /&gt;
=== 5.4 Discussion ===&lt;br /&gt;
While both propose retrieving tracking information from a node that represents a camera sensor, proposal A gives the tracking information of the camera, while C deals with the tracking information of tracked objects. This makes proposal C more extensible in terms of supporting multiple tracked objects. However, the method of defining tracked objects and markers through a proprietary configuration file needs to be revised for standardization.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 6. Using tracking information to change 3D scene ==&lt;br /&gt;
&lt;br /&gt;
=== 6.1 Proposal A ===&lt;br /&gt;
This proposal suggests routing the tracking information from the CameraSensor node to a Viewpoint node’s position and orientation. This could also be extended by a MatrixViewpoint node (described in 8.1), which could have a field identifying the corresponding CameraSensor node, producing the same result without explicitly routing the corresponding fields.&lt;br /&gt;
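&lt;br /&gt;
A minimal routing sketch for this approach (the DEF names are illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint DEF='vp'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='position' toNode='vp' toField='position'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='cam' fromField='orientation' toNode='vp' toField='orientation'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;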
&lt;br /&gt;
&lt;br /&gt;
=== 6.3 Proposal C ===&lt;br /&gt;
This proposal routes the tracking information from the IOSensor node to a Transform node of the corresponding virtual object, or to the viewpoint. Example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;MatrixTransform DEF='TransformRelativeToCam'&amp;gt; &lt;br /&gt;
    &amp;lt;Shape&amp;gt; &lt;br /&gt;
        &amp;lt;Appearance&amp;gt; &lt;br /&gt;
            &amp;lt;Material diffuseColor='1 0.5 0' /&amp;gt; &lt;br /&gt;
        &amp;lt;/Appearance&amp;gt; &lt;br /&gt;
        &amp;lt;Teapot size='5 5 5' /&amp;gt; &lt;br /&gt;
    &amp;lt;/Shape&amp;gt; &lt;br /&gt;
&amp;lt;/MatrixTransform&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_ModelView' toNode='TransformRelativeToCam' toField='set_matrix'/&amp;gt; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For routing a transformation matrix to a transform node, this proposal also introduces a MatrixTransform node that takes a transformation matrix directly, rather than using position and orientation fields. The render field allows determining visibility.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MatrixTransform : X3DGroupingNode {&lt;br /&gt;
 ...&lt;br /&gt;
 SFBool     [in,out] render TRUE&lt;br /&gt;
 SFMatrix4f [in,out] matrix identity&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
It is of course also possible to route the tracked camera pose (also in orientation/position notation) to the bound Viewpoint node.&lt;br /&gt;
There are different field-of-view modes: vertical, horizontal, and smaller. The field of view and principal point delivered by the IOSensor can be routed to the viewpoint; see the example below.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint principalPoint='0 0' fieldOfView='0.785398' fovMode='SMALLER' aspect='1.0' retainUserOffsets='FALSE' &lt;br /&gt;
 zFar='-1' jump='TRUE' zNear='-1' description='' position='0 0 10' orientation='0 0 1 0' centerOfRotation='0 0 0' /&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint DEF='vp' position='0 0 0' fovMode='VERTICAL' /&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_PrincipalPoint' toNode='vp' toField='principalPoint'/&amp;gt; &lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_FOV_vertical' toNode='vp' toField='fieldOfView'/&amp;gt; &lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_CAM_aspect' toNode='vp' toField='aspect'/&amp;gt; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 6.4 Discussion ===&lt;br /&gt;
While both proposals rely on routing to apply tracking results to the 3D scene, as discussed in 5.4, proposal A focuses on updating the Viewpoint node, while proposal C allows updating both the camera and a virtual object (or scene). Proposal C also proposes a new type of transformation node for dealing with transformation matrices, while A sticks to traditional position and orientation vectors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 7. Retrieving camera calibration (internal parameters) information ==&lt;br /&gt;
&lt;br /&gt;
=== 7.1 Proposal A ===&lt;br /&gt;
This proposal suggests using the same CameraSensor node, used for retrieving the live video stream, also for retrieving camera calibration information. As described in 2.1, the proposed CameraSensor node includes a ‘projmat’ field which represents the calibration information of the CameraSensor.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 …&amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 7.3 Proposal C ===&lt;br /&gt;
This proposal suggests using the same IOSensor node as is used for retrieving images from the camera sensor. Several fields of this node (in this example called TrackedObject1Camera_PrincipalPoint, TrackedObject1Camera_FOV_horizontal, TrackedObject1Camera_FOV_vertical and TrackedObject1Camera_CAM_aspect) provide the calibration information. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;IOSensor DEF='VisionLib' type='VisionLib' configFile='TutorialMarkerTracking_OneMarker.pm'&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='VideoSourceImage' type='SFImage'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_ModelView' type='SFMatrix4f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_PrincipalPoint' type='SFVec2f'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_horizontal' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_FOV_vertical' type='SFFloat'/&amp;gt;&lt;br /&gt;
    &amp;lt;field accessType='outputOnly' name='TrackedObject1Camera_CAM_aspect' type='SFFloat'/&amp;gt;&lt;br /&gt;
&amp;lt;/IOSensor&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 7.4 Discussion ===&lt;br /&gt;
Both proposals suggest reusing the node that is used for accessing a camera sensor, with dedicated fields of that node providing the camera calibration information. While proposal A suggests using a projection matrix as the calibration information, C suggests using a set of parameters. The latter approach could be safer, since it encapsulates the projection matrix of a viewpoint, which could be implementation dependent based on the graphics API used.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 8. Using calibration information to set properties of (virtual) camera ==&lt;br /&gt;
&lt;br /&gt;
=== 8.1 Proposal A ===&lt;br /&gt;
This proposal suggests a MatrixViewpoint node: a child of a scene node that represents a virtual viewpoint calibrated according to the corresponding physical live video camera (on the user's computer). The 'projmat' field represents the internal parameters (or projection matrix) of the MatrixViewpoint. The ‘position' and ‘orientation’ fields represent the three-dimensional position and orientation of the viewpoint within the virtual space. The ‘cameraSensor’ field references a CameraSensor node from which the viewpoint parameters (including the projmat, position and orientation fields) of the MatrixViewpoint are updated. Once the ‘cameraSensor’ field is assigned a valid CameraSensor node, the viewpoint parameters are updated according to the values from that node. Alternatively, each parameter of the MatrixViewpoint node can be routed from a corresponding source of calibrated values.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MatrixViewpoint : X3DViewpointNode{&lt;br /&gt;
     SFMatrix4f 		[in,out]	projmat&lt;br /&gt;
     SFVec3f 		[in,out]	position&lt;br /&gt;
     SFRotation 		[in,out]	orientation&lt;br /&gt;
     SFNode 		[in,out]	cameraSensor&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
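&lt;br /&gt;
For example, the viewpoint could be bound to a sensor like this (a hypothetical sketch; whether the XML encoding would use a containerField or a dedicated child element is not specified by the proposal):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;MatrixViewpoint&amp;gt;&lt;br /&gt;
    &amp;lt;CameraSensor USE='cam' containerField='cameraSensor'/&amp;gt;&lt;br /&gt;
&amp;lt;/MatrixViewpoint&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;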
&lt;br /&gt;
&lt;br /&gt;
=== 8.3 Proposal C ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Viewpoint : X3DViewpointNode {&lt;br /&gt;
  ...&lt;br /&gt;
  SFString [in,out] fovMode        VERTICAL&lt;br /&gt;
  SFVec2f  [in,out] principalPoint 0 0&lt;br /&gt;
  SFFloat  [in,out] aspect         1.0&lt;br /&gt;
  SFFloat  [in,out] zNear          -1&lt;br /&gt;
  SFFloat  [in,out] zFar           -1&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The new fields provide a more general camera model than the standard Viewpoint. The &amp;quot;principalPoint&amp;quot; field defines the relative position of the principal point. If the principal point is not equal to zero, the viewing frustum parameters (left, right, top, bottom) are simply shifted in the camera's image plane. A value of x = 2 means the left value is equal to the default right value; a value of x = -2 means the right value is equal to the default left value. If the principal point is not equal to zero, the &amp;quot;fieldOfView&amp;quot; value is not equal to the real field of view of the camera; otherwise it complies with the default settings. &lt;br /&gt;
&lt;br /&gt;
To extend this idea, the &amp;quot;fovMode&amp;quot; defines whether the field of view is measured vertically, horizontally or in the smaller direction, which is important for correctly parameterizing the aforementioned cinematographic camera.&lt;br /&gt;
The field &amp;quot;aspect&amp;quot; defines the aspect ratio for the viewing angle defined by the &amp;quot;fieldOfView&amp;quot; range. This setting is independent of the current aspect ratio of the window, but reflects the aspect ratio of the actual capturing device. This extension allows us to model cameras with a non-quadratic pixel format, i.e. it defines the (width / height) of a pixel.&lt;br /&gt;
&lt;br /&gt;
In addition to the Viewpoint extension we include a new camera node named Viewfrustum. This node has the two input/output fields &amp;quot;modelview&amp;quot; and &amp;quot;projection&amp;quot; of type SFMatrix4f. With the Viewfrustum node we are able to define a camera position and projection utilizing a standard projection/ modelview matrix pair.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Viewfrustum : X3DViewpointNode {&lt;br /&gt;
  ...&lt;br /&gt;
  SFMatrix4f [in,out] modelview  (identity)&lt;br /&gt;
  SFMatrix4f [in,out] projection (identity)&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
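&lt;br /&gt;
The matrices delivered by an IOSensor could then be routed directly to such a node, for example (assuming the tracking configuration also exposes a projection matrix field; the field name 'Camera_Projection' is illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewfrustum DEF='vf'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='TrackedObject1Camera_ModelView' toNode='vf' toField='modelview'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_Projection' toNode='vf' toField='projection'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;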
&lt;br /&gt;
&lt;br /&gt;
=== 8.4 Discussion ===&lt;br /&gt;
Both proposals propose a new type of viewpoint node to support the camera calibration information described in section 7. While they use different types and numbers of fields for representing the camera calibration information, they both use the same routing method to apply these values to a viewpoint node. As discussed in 7.4, assigning a projection matrix directly to a viewpoint may result in defects, such as incorrect projections or near/far clipping planes.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 9. Specifying nodes as physical object representatives ==&lt;br /&gt;
&lt;br /&gt;
=== 9.1 Proposal A ===&lt;br /&gt;
This proposal suggests a GhostGroup node for indicating that its child nodes are representatives of physical objects, for visualizing correct occlusion. The proposed node extends the Group node such that the geometries of its child nodes are rendered as ghost objects. The browser should render the child nodes only into the depth buffer and not into the color buffer. As a result, the portion of the live video image corresponding to the ghost object is visualized with correct depth values, forming correct occlusion with other virtual objects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GhostGroup: X3DGroupingNode{&lt;br /&gt;
     ... // same fields as the original Group node&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
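&lt;br /&gt;
A usage sketch (the geometry is illustrative): the geometry of a known physical object is placed under a GhostGroup so that it occludes virtual objects behind it:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;GhostGroup&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
        &amp;lt;Box size='2 1 1'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;/GhostGroup&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;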
&lt;br /&gt;
&lt;br /&gt;
=== 9.3 Proposal C ===&lt;br /&gt;
This proposal uses a ColorMaskMode node to render the geometry only into the depth buffer and not into the color buffer. In addition, a new field &amp;quot;sortKey&amp;quot; is proposed for the Appearance node to make sure the ghost objects are rendered before other geometries.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Shape&amp;gt;&lt;br /&gt;
   &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
     &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
   &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
   ...&lt;br /&gt;
&amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 9.4 Discussion ===&lt;br /&gt;
While proposal A suggests a high-level, simple-to-use approach for a specific application in AR/MR, proposal C provides a more finely controllable, general-purpose approach.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 10. Conclusion ==&lt;br /&gt;
While both proposals cover a similar set of functionalities required for supporting AR and MR visualization in X3D, proposal A takes the path of higher-level control, providing simpler syntax that can be applied to specific AR and MR cases. In contrast, proposal C introduces more general-purpose nodes and suggests combining them to implement the required functions, making AR and MR visualization a special use case of the proposed extension. Considering the difference between the proposals, the trade-off between simplicity and flexibility/extensibility needs further discussion.&lt;br /&gt;
&lt;br /&gt;
From the content author's point of view, higher-level abstracted control gives a simpler, easier-to-use syntax. However, detailed control might be missing that could be necessary for applications other than AR/MR. &lt;br /&gt;
&lt;br /&gt;
From the browser implementer's point of view, encapsulating the functions in higher-level APIs gives more room to choose how to implement a given function. On the other hand, if further detailed control is required and added later for other applications, this could affect how the earlier higher-level components are implemented and may require implementation-level changes. Testing each function is also more complicated if low-level details are accessible to scene authors, since there are more cases to test to make sure it works in the general case.&lt;br /&gt;
&lt;br /&gt;
Providing both in parallel could be an alternative; however, this would place a greater burden on browser implementers.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=Comparison_of_X3D_AR_Proposals&amp;diff=4873</id>
		<title>Comparison of X3D AR Proposals</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=Comparison_of_X3D_AR_Proposals&amp;diff=4873"/>
				<updated>2011-12-21T09:31:44Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= Comparison between existing proposals - Working Draft =&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Augmented Reality Working Group&lt;br /&gt;
Web3D Consortium&lt;br /&gt;
&lt;br /&gt;
July 20, 2011&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1. Introduction ==&lt;br /&gt;
&lt;br /&gt;
This document compares the existing proposals for extending X3D to support augmented and mixed reality visualization. Three (?) main proposals are compared in terms of requirements – two from the Korea Chapter and one from Instant Reality.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 2. Using Live Video stream as a texture ==&lt;br /&gt;
&lt;br /&gt;
=== 2.1 Proposal A ===&lt;br /&gt;
This proposal introduced a new sensor node, CameraSensor (previously named LiveCamera), for retrieving live video data from a camera device and routing the video stream to a PixelTexture node. The X3D browser is in charge of implementing and handling devices and mapping the video data to the CameraSensor node inside the X3D scene. The video stream itself is provided as the value (SFImage) field of the node, which is updated every frame by the browser implementation according to the camera data.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
CameraSensor:X3DDirectSensorNode {    &lt;br /&gt;
   SFImage 	[out]		value    &lt;br /&gt;
   SFBool   	[out]         	on       	FALSE    &lt;br /&gt;
   SFMatrix4f	[out]		projmat   &amp;quot;1 0 0 0 …&amp;quot;    &lt;br /&gt;
   SFBool	[out]		tracking	FALSE    &lt;br /&gt;
   SFVec3f	[out]		position    &lt;br /&gt;
   SFRotation 	[out]		orientation  &lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While this is straightforward, routing SFImage values might lead to performance and implementation problems. As an alternative, the same proposal also proposed extending the behavior of the existing MovieTexture node to support a live video stream within the node. The proposed behavior is for the X3D browser to let the user select a file or a camera device for a MovieTexture node in the scene if the url field of the node is empty (or filled with special token values, such as ‘USER_CUSTOMIZED’).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;MovieTexture loop='true'   url=''/&amp;gt; &lt;br /&gt;
&amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
While this approach avoids performance problems by not exposing SFImage fields updated in real time, it lacks support for using the live video stream data for other purposes, such as the background. This is partially solved by adding a new node, MovieBackground, which behaves similarly to the MovieTexture but uses the user-selected movie file or the live video stream from a camera to fill the background of the 3D scene.&lt;br /&gt;
&lt;br /&gt;
=== 2.2 Proposal B ===&lt;br /&gt;
The proposal from Gerard Kim, of the Korea Chapter, proposed a new sensor node, …&lt;br /&gt;
&lt;br /&gt;
=== 2.3 Proposal C ===&lt;br /&gt;
&lt;br /&gt;
The proposal from Instant Reality proposed a new sensor node, …&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 2.4 Discussion ===&lt;br /&gt;
These three proposals included similar structures and nodes … &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 3. Using Live Video stream as a background ==&lt;br /&gt;
&lt;br /&gt;
=== 3.1 Proposal A ===&lt;br /&gt;
The proposal introduced a MovieBackground node, extended from the Background node with a ‘liveSource’ field that is assigned a CameraSensor node (as described in 2.1) from which the Background node receives the live video stream data. Once the ‘liveSource’ field is assigned a valid CameraSensor node, the background image is updated according to the live video stream from that node. For other uses, it also has a url field to which a general movie clip source can be assigned and used as a background.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MovieBackground:X3DBackgroundNode {&lt;br /&gt;
     ... // same as the original Background node&lt;br /&gt;
     SFString    [in] url&lt;br /&gt;
     SFNode 	[in] liveSource&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
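&lt;br /&gt;
For example, the background could be fed from a live camera like this (a hypothetical sketch; whether the XML encoding would use a containerField or a dedicated child element is not specified by the proposal):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;CameraSensor DEF='cam'/&amp;gt;&lt;br /&gt;
&amp;lt;MovieBackground&amp;gt;&lt;br /&gt;
    &amp;lt;CameraSensor USE='cam' containerField='liveSource'/&amp;gt;&lt;br /&gt;
&amp;lt;/MovieBackground&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;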
&lt;br /&gt;
Similar to the case in 2.1, the proposal also suggests a different approach, in which the MovieBackground node doesn’t explicitly need a CameraSensor node; instead, the browser asks the user to choose the movie source (including a camera device) when the url field is left empty (or filled with special token values, such as ‘USER_CUSTOMIZED’).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 3.3 Proposal C ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 4. Supporting color keying in texture ==&lt;br /&gt;
&lt;br /&gt;
=== 4.1 Proposal A ===&lt;br /&gt;
This proposal adds a ‘keyColor’ field to the MovieTexture node, which indicates the color to be rendered as transparent, in order to provide a chroma-key effect on the movie texture. The browser is in charge of rendering the matching parts of the MovieTexture as transparent, and browsers that do not support this feature can simply fall back to rendering the MovieTexture in the normal way (i.e. showing the texture as is).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MovieTexture:X3DTexture2DNode {&lt;br /&gt;
     ... // same as the MovieTexture node described in 2.1&lt;br /&gt;
SFColor    [in] keyColor&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 4.3 Proposal C ===&lt;br /&gt;
&lt;br /&gt;
We introduce a new background node, the PolygonBackground. It allows defining an aspect ratio for the background image that is independent of the actual window size. Different modes (vertical or horizontal) are available to fit the image into the window. The image assigned to the image outslot of the IOSensor is routed to the texture in the appearance of the PolygonBackground node. Here is an example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt; &lt;br /&gt;
    &amp;lt;Appearance&amp;gt; &lt;br /&gt;
        &amp;lt;PixelTexture DEF='tex' /&amp;gt; &lt;br /&gt;
    &amp;lt;/Appearance&amp;gt; &lt;br /&gt;
&amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='VideoSourceImage' toNode='tex' toField='image'/&amp;gt; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 5. Retrieving tracking information ==&lt;br /&gt;
&lt;br /&gt;
=== 5.1 Proposal A ===&lt;br /&gt;
This proposal does not define an explicit new interface for tracking information, but suggests using the same CameraSensor node, used for retrieving the live video stream, also for retrieving tracking information. As described in 2.1, the proposed CameraSensor node includes ‘position’ and ‘orientation’ fields that represent the tracked camera motion. A limitation of this method is that it does not support tracking information for objects other than the camera itself. &lt;br /&gt;
&lt;br /&gt;
== 6. Using tracking information to change 3D scene ==&lt;br /&gt;
&lt;br /&gt;
=== 6.1 Proposal A ===&lt;br /&gt;
This proposal does not introduce any new node or function, but suggests routing the tracking information from the CameraSensor node to a Viewpoint node’s position and orientation. This could also be extended by a MatrixViewpoint node (described in 8.1), which could have a field identifying the corresponding CameraSensor node, producing the same result without explicitly routing the corresponding fields.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 7. Retrieving camera calibration information ==&lt;br /&gt;
&lt;br /&gt;
=== 7.1 Proposal A ===&lt;br /&gt;
This proposal doesn’t define an explicit new interface for camera calibration information, but suggests using the same CameraSensor node, used for retrieving the live video stream, also for retrieving it. As described in 2.1, the proposed CameraSensor node includes a ‘projmat’ field which represents the calibration information of the CameraSensor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 8. Using calibration information to set properties of (virtual) camera ==&lt;br /&gt;
&lt;br /&gt;
=== 8.1 Proposal A ===&lt;br /&gt;
This proposal suggests a MatrixViewpoint node: a child of a scene node that represents a virtual viewpoint calibrated according to the corresponding physical live video camera (on the user's computer). The 'projmat' field represents the internal parameters (or projection matrix) of the MatrixViewpoint. The ‘position' and ‘orientation’ fields represent the three-dimensional position and orientation of the viewpoint within the virtual space. The ‘cameraSensor’ field references a CameraSensor node from which the viewpoint parameters (including the projmat, position and orientation fields) of the MatrixViewpoint are updated. Once the ‘cameraSensor’ field is assigned a valid CameraSensor node, the viewpoint parameters are updated according to the values from that node. Alternatively, each parameter of the MatrixViewpoint node can be routed from a corresponding source of calibrated values.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
MatrixViewpoint : X3DViewpointNode{&lt;br /&gt;
     SFMatrix4f 		[in,out]	projmat&lt;br /&gt;
     SFVec3f 		[in,out]	position&lt;br /&gt;
     SFRotation 		[in,out]	orientation&lt;br /&gt;
     SFNode 		[in,out]	cameraSensor&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== 8.3 Proposal C ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Viewpoint : X3DViewpointNode {&lt;br /&gt;
  ...&lt;br /&gt;
  SFString [in,out] fovMode        VERTICAL&lt;br /&gt;
  SFVec2f  [in,out] principalPoint 0 0&lt;br /&gt;
  SFFloat  [in,out] aspect         1.0&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The new fields provide a more general camera model than the standard Viewpoint. The ``principalPoint'' field defines the relative position of the principal point. If the principal point is not equal to zero, the viewing frustum parameters (left, right, top, bottom) are simply shifted in the camera's image plane. A value of x = 2 means the left value is equal to the default right value; a value of x = -2 means the right value is equal to the default left value. If the principal point is not equal to zero, the ``fieldOfView'' value is not equal to the real field of view of the camera; otherwise it complies with the default settings. &lt;br /&gt;
&lt;br /&gt;
To extend this idea, the ``fovMode'' defines whether the field of view is measured vertically, horizontally or in the smaller direction, which is important for correctly parameterizing the aforementioned cinematographic camera.&lt;br /&gt;
The field ``aspect'' defines the aspect ratio for the viewing angle defined by the ``fieldOfView'' range. This setting is independent of the current aspect ratio of the window; instead it reflects the aspect ratio of the actual capturing device, i.e. it defines the (width / height) ratio of a pixel. This extension allows us to model cameras with non-square pixels.&lt;br /&gt;
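As a sketch, this is how a Viewpoint might be parameterized for a capturing device whose field of view was measured horizontally; the concrete values are purely illustrative:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;!-- fieldOfView interpreted horizontally; principalPoint shifts the&lt;br /&gt;
     frustum in the image plane; aspect is the pixel width/height ratio --&amp;gt;&lt;br /&gt;
&amp;lt;Viewpoint fieldOfView='0.9' fovMode='HORIZONTAL'&lt;br /&gt;
           principalPoint='0.1 0' aspect='1.09'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;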
&lt;br /&gt;
In addition to the Viewpoint extension we include a new camera node named Viewfrustum. This node has the two input/output fields ``modelview'' and ``projection'' of type SFMatrix4f. With the Viewfrustum node we are able to define a camera position and projection utilizing a standard projection/modelview matrix pair.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Viewfrustum : X3DViewpointNode {&lt;br /&gt;
  ...&lt;br /&gt;
  SFMatrix4f [in,out] modelview  (identity)&lt;br /&gt;
  SFMatrix4f [in,out] projection (identity)&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
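A usage sketch for the Viewfrustum node, assuming an IOSensor DEF'd as 'VisionLib' as in the MatrixTransform example below; the 'Camera_Projection' field name is an assumption for illustration:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;Viewfrustum DEF='TrackedCam'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_ModelView'  toNode='TrackedCam' toField='modelview'/&amp;gt;&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_Projection' toNode='TrackedCam' toField='projection'/&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;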
&amp;lt;pre&amp;gt;&lt;br /&gt;
MatrixTransform : X3DGroupingNode {&lt;br /&gt;
 ...&lt;br /&gt;
 SFBool     [in,out] render TRUE&lt;br /&gt;
 SFMatrix4f [in,out] matrix identity&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Accordingly, we also propose a new transform node type, the MatrixTransform. The modelview matrix delivered by the IOSensor node can be applied to a MatrixTransform node. The objects that are superimposed (i.e. the “augmentations”) are children of this MatrixTransform. Here is an example:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;MatrixTransform DEF='TransformRelativeToCam'&amp;gt; &lt;br /&gt;
    &amp;lt;Shape&amp;gt; &lt;br /&gt;
        &amp;lt;Appearance&amp;gt; &lt;br /&gt;
            &amp;lt;Material diffuseColor='1 0.5 0' /&amp;gt; &lt;br /&gt;
        &amp;lt;/Appearance&amp;gt; &lt;br /&gt;
        &amp;lt;Teapot size='5 5 5' /&amp;gt; &lt;br /&gt;
    &amp;lt;/Shape&amp;gt; &lt;br /&gt;
&amp;lt;/MatrixTransform&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;ROUTE fromNode='VisionLib' fromField='Camera_ModelView' toNode='TransformRelativeToCam' toField='set_matrix'/&amp;gt; &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 9. Specifying nodes as physical object representatives ==&lt;br /&gt;
&lt;br /&gt;
=== 9.1 Proposal A ===&lt;br /&gt;
This proposal suggests a GhostGroup node for indicating that its child nodes are representatives of physical objects, used to visualize correct occlusion. The proposed node extends the Group node so that the geometries of its child nodes are rendered as ghost objects: the browser should render the child nodes only into the depth buffer and not into the color buffer. As a result, the portion of the live video image corresponding to a ghost object is visualized with correct depth values, forming correct occlusion with other virtual objects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
GhostGroup : X3DGroupingNode {&lt;br /&gt;
     ... // same fields as the original Group node&lt;br /&gt;
}&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
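A usage sketch: a phantom model of a real object (e.g. a table) is placed under a GhostGroup so that virtual objects behind it are correctly occluded by the video image; the geometry itself is elided:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;!-- children are written to the depth buffer only --&amp;gt;&lt;br /&gt;
&amp;lt;GhostGroup&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
        ...  &amp;lt;!-- geometry matching the real table --&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&amp;lt;/GhostGroup&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;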
&lt;br /&gt;
=== 9.3 Proposal C ===&lt;br /&gt;
&lt;br /&gt;
See http://www.web3d.org/x3d/wiki/index.php/X3D_and_Augmented_Reality&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3766</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3766"/>
				<updated>2011-05-25T08:30:56Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the [http://www.x3dom.org X3DOM] open-source project produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked: how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
&lt;br /&gt;
[http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
&lt;br /&gt;
[http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
&lt;br /&gt;
SC24 Augmented and Mixed Reality Study Group Meeting  @ SC24 Plenary and Working Group Meetings, August 21 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually held as follows:&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Wednesday, which is 0910-1010 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes like the IOSensor node for retrieving camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR visualization have already been published at the Web3D conferences. There, e.g. occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The screenshots below show several issues in MR visualization.&lt;br /&gt;
From top left to bottom right: (a) real image of a room; (b) real scene augmented with a virtual character (note that the character appears in front of the table); (c) augmentation with additional occlusion handling (note that the character still seems to float above the floor); (d) augmentation with occlusion and shadows (applied via differential rendering).&lt;br /&gt;
&lt;br /&gt;
[[image:Kaiser140.png|600px|MR visualization]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In the following, an example for achieving occlusion effects between real and virtual objects in AR/MR scenes is shown, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). Here, the invisible ghosting objects (denoting real scene geometry) are simply created by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
      &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
  &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To set the camera's image in the background we use the aforementioned PolygonBackground node. By setting its &amp;quot;fixedImageSize&amp;quot; field, the aspect ratio of the image can be defined. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
  &amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
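The live camera image can then be routed from the vision subsystem into this texture. Note that the IOSensor DEF name and the 'VideoSourceImage' out-slot name are placeholders here, as the actual names depend on the configured device:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;ROUTE fromNode='VisionLib' fromField='VideoSourceImage' toNode='tex' toField='set_image'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;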
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based contents.  This is especially due to the recent world-wide appearance of mobile AR services and the realization (in both academia and industry) of the definite need for exchanging service contents across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight each proposal's distinctions with regard to the others, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied using the &amp;quot;routes&amp;quot; (e.g. virtual objects' parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below shows an example construct which is a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics would be to attach a sphere to a marker when it is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enable='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible' /&amp;gt; &lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or existing ones extended, to describe various AR contents.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node and be supplied with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.  &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML meta descriptors together with existing standards (e.g. X3D, Collada, etc.) for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification using a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;who&amp;quot; part specifies the owner/author of the contents.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;when&amp;quot; part specifies content creation time.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content. &lt;br /&gt;
&lt;br /&gt;
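A purely hypothetical sketch of how such a 5W meta descriptor might look in XML; all element names are invented for illustration, as the proposal does not fix a concrete schema:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;ARContent&amp;gt;&lt;br /&gt;
    &amp;lt;Who&amp;gt;content author/owner&amp;lt;/Who&amp;gt;&lt;br /&gt;
    &amp;lt;When&amp;gt;creation time&amp;lt;/When&amp;gt;&lt;br /&gt;
    &amp;lt;Where&amp;gt;location of the physical target object&amp;lt;/Where&amp;gt;&lt;br /&gt;
    &amp;lt;What&amp;gt;link to the augmentation content (e.g. X3D, Collada)&amp;lt;/What&amp;gt;&lt;br /&gt;
    &amp;lt;How&amp;gt;behavior description&amp;lt;/How&amp;gt;&lt;br /&gt;
  &amp;lt;/ARContent&amp;gt;&lt;br /&gt;
&lt;br /&gt;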
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip archive containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perey's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3765</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3765"/>
				<updated>2011-05-25T08:28:30Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the [http://www.x3dom.org X3DOM] open-source project produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked: how Mobile, HTML5 and possibly Augmented Reality (AR) components&lt;br /&gt;
can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
&lt;br /&gt;
[http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
&lt;br /&gt;
[http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
&lt;br /&gt;
SC24 Augmented and Mixed Reality Study Group Meeting  @ SC24 Plenary and Working Group Meetings, August 21 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually held as follows:&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Wednesday, which is 0910-1010 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes like the IOSensor node for retrieving camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR visualization have already been published at the Web3D conferences. There, e.g. occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The screenshots below show several issues in MR visualization.&lt;br /&gt;
From top left to bottom right: (a) real image of a room; (b) real scene augmented with a virtual character (note that the character appears in front of the table); (c) augmentation with additional occlusion handling (note that the character still seems to float above the floor); (d) augmentation with occlusion and shadows (applied via differential rendering).&lt;br /&gt;
&lt;br /&gt;
[[image:Kaiser140.png|600px|MR visualization]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In the following, an example for achieving occlusion effects between real and virtual objects in AR/MR scenes is shown, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). Here, the invisible ghosting objects (denoting real scene geometry) are simply created by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
      &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
  &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To place the camera's image in the background we use the aforementioned PolygonBackground node. Its &amp;quot;fixedImageSize&amp;quot; field defines the aspect ratio of the image. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
  &amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This is especially due to the recent world-wide emergence of mobile AR services and the realization (by both academia and industry) of the definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight their distinctions with regard to the other proposals, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal is highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied using the &amp;quot;routes&amp;quot; (e.g. virtual objects' parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics would be to attach a sphere to a marker when it is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
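&lt;br /&gt;
Purely for illustration, such descriptions might be sketched as follows (the &amp;quot;ImagePatch&amp;quot; and &amp;quot;GpsLocation&amp;quot; node and field names are hypothetical placeholders, not part of the proposal documents):&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- hypothetical: track a planar patch given a reference image --&amp;gt;&lt;br /&gt;
  &amp;lt;ImagePatch DEF='POSTER' url='poster.png'/&amp;gt;&lt;br /&gt;
  &amp;lt;!-- hypothetical: anchor an augmentation at a GPS position --&amp;gt;&lt;br /&gt;
  &amp;lt;GpsLocation DEF='SPOT' geoCoords='37.57 126.98 30'/&amp;gt;&lt;br /&gt;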
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle the video background for a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node and be supplied with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
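&lt;br /&gt;
A minimal sketch of how these pieces might fit together is given below; the &amp;quot;LiveCam&amp;quot; field names and the routes are illustrative assumptions based on the description above, not specified syntax:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
  &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
    &amp;lt;MovieTexture DEF='VIDEO'/&amp;gt;&lt;br /&gt;
  &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
  &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='VIDEO' toField='image'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView' toNode='VIEW' toField='fieldOfView'/&amp;gt;&lt;br /&gt;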
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.), for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
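&lt;br /&gt;
Purely as an illustration of the 5W idea, such a meta descriptor might look as follows (all element names are hypothetical; the &amp;quot;what&amp;quot; part would reference content in an existing standard such as X3D or Collada):&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
    &amp;lt;Who&amp;gt;content author&amp;lt;/Who&amp;gt;&lt;br /&gt;
    &amp;lt;When&amp;gt;2011-05-22&amp;lt;/When&amp;gt;&lt;br /&gt;
    &amp;lt;Where&amp;gt;location of the physical target object&amp;lt;/Where&amp;gt;&lt;br /&gt;
    &amp;lt;What&amp;gt;reference to X3D or Collada content&amp;lt;/What&amp;gt;&lt;br /&gt;
    &amp;lt;How&amp;gt;behavior description&amp;lt;/How&amp;gt;&lt;br /&gt;
  &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;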
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perry's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of interest is that the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3753</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3753"/>
				<updated>2011-05-22T19:14:34Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked; it shows how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has been no better time to discuss X3D technologies and also join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
&lt;br /&gt;
[http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
&lt;br /&gt;
[http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
&lt;br /&gt;
SC24 Augmented and Mixed Reality Study Group Meeting  @ SC24 Plenary and Working Group Meetings, August 21 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward through the formation of the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Wednesday, which is 0910-1010 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor node for retrieving camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR visualization have already been published at the Web3D conferences; they discuss, for example, occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), and especially in section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The screenshots in the next link show several problems in MR visualization:&lt;br /&gt;
http://www.web3d.org/x3d/wiki/images/2/2f/Kaiser140.png&lt;br /&gt;
&lt;br /&gt;
From top left to bottom right: (a) real image of a room; (b) real scene augmented with a virtual character (note that the character appears to be in front of the table); (c) augmentation with additional occlusion handling (note that the character still seems to float on the floor); (d) augmentation with occlusion and shadows (applied via differential rendering).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Below, an example is shown for achieving occlusion effects between real and virtual objects in AR/MR scenes, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). The invisible ghosting objects (denoting real geometry) are created simply by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
      &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
  &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To place the camera's image in the background we use the aforementioned PolygonBackground node. Its &amp;quot;fixedImageSize&amp;quot; field defines the aspect ratio of the image. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
  &amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This is especially due to the recent world-wide emergence of mobile AR services and the realization (by both academia and industry) of the definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight their distinctions with regard to the other proposals, not as criticism but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal is highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied using the &amp;quot;routes&amp;quot; (e.g. virtual objects' parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics would be to attach a sphere to a marker when it is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
 &amp;lt;Group&amp;gt;&lt;br /&gt;
  &amp;lt;Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
  &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='TRUE'/&amp;gt;&lt;br /&gt;
  &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
   &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Material/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
   &amp;lt;/Shape&amp;gt;&lt;br /&gt;
  &amp;lt;/Transform&amp;gt;&lt;br /&gt;
 &amp;lt;/Group&amp;gt;&lt;br /&gt;
 &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
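&lt;br /&gt;
Purely for illustration, such descriptions might be sketched as follows (the &amp;quot;ImagePatch&amp;quot; and &amp;quot;GpsLocation&amp;quot; node and field names are hypothetical placeholders, not part of the proposal documents):&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;!-- hypothetical: track a planar patch given a reference image --&amp;gt;&lt;br /&gt;
  &amp;lt;ImagePatch DEF='POSTER' url='poster.png'/&amp;gt;&lt;br /&gt;
  &amp;lt;!-- hypothetical: anchor an augmentation at a GPS position --&amp;gt;&lt;br /&gt;
  &amp;lt;GpsLocation DEF='SPOT' geoCoords='37.57 126.98 30'/&amp;gt;&lt;br /&gt;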
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle the video background for a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node and be supplied with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters and to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
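&lt;br /&gt;
A minimal sketch of how these pieces might fit together is given below; the &amp;quot;LiveCam&amp;quot; field names and the routes are illustrative assumptions based on the description above, not specified syntax:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
  &amp;lt;TextureBackground&amp;gt;&lt;br /&gt;
    &amp;lt;MovieTexture DEF='VIDEO'/&amp;gt;&lt;br /&gt;
  &amp;lt;/TextureBackground&amp;gt;&lt;br /&gt;
  &amp;lt;Viewpoint DEF='VIEW'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='VIDEO' toField='image'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='fieldOfView' toNode='VIEW' toField='fieldOfView'/&amp;gt;&lt;br /&gt;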
&lt;br /&gt;
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.), for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification following a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;who&amp;quot; part specifies the owner/author of the content.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;when&amp;quot; part specifies the content creation time.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- The &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content.&lt;br /&gt;
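&lt;br /&gt;
Purely as an illustration of the 5W idea, such a meta descriptor might look as follows (all element names are hypothetical; the &amp;quot;what&amp;quot; part would reference content in an existing standard such as X3D or Collada):&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
    &amp;lt;Who&amp;gt;content author&amp;lt;/Who&amp;gt;&lt;br /&gt;
    &amp;lt;When&amp;gt;2011-05-22&amp;lt;/When&amp;gt;&lt;br /&gt;
    &amp;lt;Where&amp;gt;location of the physical target object&amp;lt;/Where&amp;gt;&lt;br /&gt;
    &amp;lt;What&amp;gt;reference to X3D or Collada content&amp;lt;/What&amp;gt;&lt;br /&gt;
    &amp;lt;How&amp;gt;behavior description&amp;lt;/How&amp;gt;&lt;br /&gt;
  &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;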
&lt;br /&gt;
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip file containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perry's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of interest is that the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=File:Kaiser140.png&amp;diff=3752</id>
		<title>File:Kaiser140.png</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=File:Kaiser140.png&amp;diff=3752"/>
				<updated>2011-05-22T18:43:49Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: Occlusions and Shadows in MR.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Occlusions and Shadows in MR.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3750</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3750"/>
				<updated>2011-05-22T18:40:28Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
The [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked; it shows how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has been no better time to discuss X3D technologies and also join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
&lt;br /&gt;
[http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
&lt;br /&gt;
[http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
&lt;br /&gt;
SC24 Augmented and Mixed Reality Study Group Meeting  @ SC24 Plenary and Working Group Meetings, August 21 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward through the formation of the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Wednesday, which is 0910-1010 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, the IOSensor node for retrieving camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR visualization have already been published at the Web3D conferences. There, for example, occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially in section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Below, an example for achieving occlusion effects between real and virtual objects in AR/MR scenes is shown, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). Here, the invisible ghosting objects (denoting real geometry) are simply created by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
      &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
  &amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To set the camera's image in the background we use the aforementioned PolygonBackground node. By setting its &amp;quot;fixedImageSize&amp;quot; field, the aspect ratio of the image can be defined. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
  &amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea Chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This is especially due to the recent world-wide appearance of mobile AR services and the realization (by both academia and industry) of the definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight their distinctions with regard to the other proposals, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied using the &amp;quot;routes&amp;quot; (e.g. virtual objects' parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics would be to attach a sphere to a marker when the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node and be supplied with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
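A minimal sketch of how such a &amp;quot;LiveCam&amp;quot; node might be wired to the video background; note that the field names used here (&amp;quot;image&amp;quot;, &amp;quot;texture&amp;quot;) are illustrative assumptions only and are not part of the proposal text:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
  &amp;lt;TextureBackground DEF='BG'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='BG' toField='texture'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The camera parameters would be routed analogously from the &amp;quot;LiveCam&amp;quot; to the extended viewpoint node.&lt;br /&gt;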
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML, as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.), for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification of a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;who&amp;quot; part specifies the owner/author of the contents.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;when&amp;quot; part specifies content creation time.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content. &lt;br /&gt;
&lt;br /&gt;
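Purely as an illustration, such an XML meta descriptor following the 5W approach might look as follows; all element names here are hypothetical and not taken from the proposal itself:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
   &amp;lt;Who&amp;gt;content author&amp;lt;/Who&amp;gt;&lt;br /&gt;
   &amp;lt;When&amp;gt;2011-03-21&amp;lt;/When&amp;gt;&lt;br /&gt;
   &amp;lt;Where&amp;gt;location of the physical object&amp;lt;/Where&amp;gt;&lt;br /&gt;
   &amp;lt;What&amp;gt;link to X3D or Collada content&amp;lt;/What&amp;gt;&lt;br /&gt;
   &amp;lt;How&amp;gt;behavior description&amp;lt;/How&amp;gt;&lt;br /&gt;
  &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;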
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip archive containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perry's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3749</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3749"/>
				<updated>2011-05-22T16:31:50Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History and Background Information ==&lt;br /&gt;
Web3D Consortium formed a special interest group on AR initiatives in July 2009, which worked to help create the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
Several Web3D Consortium member projects showcase the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely within the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf X3D Mobile Profile slideset]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, showing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components can be aligned together.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has been no better time to discuss X3D technologies and also join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Calendar =&lt;br /&gt;
&lt;br /&gt;
[http://www.perey.com/ARStandards/third-international-ar-standards-meeting/ Third International AR Standards Meeting, June 15-17, Taichung, Taiwan]&lt;br /&gt;
&lt;br /&gt;
[http://web3d2011.org/ Augmented/Mixed Reality Workshop at Web3D Conference, 10:50AM-12:10PM Local Time,  June 21, Paris, France]&lt;br /&gt;
&lt;br /&gt;
[http://www.siggraph.org/s2011/for_attendees/birds-feather Augmented and Mixed Reality Web3D BOF at SIGGRAPH 2011, August 2011, Vancouver, Canada]&lt;br /&gt;
&lt;br /&gt;
SC24 Augmented and Mixed Reality Study Group Meeting @ SC24 Plenary and Working Group Meetings, August 21, 2011, Rapid City, South Dakota, USA&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality (AR) Working Group focuses on utilizing and extending X3D capabilities to support augmented reality (AR) and mixed reality (MR) applications.&lt;br /&gt;
&lt;br /&gt;
''Discussion.''  These X3D AR discussions were initially held as part of a special interest group. Now that we have determined that sufficient interest exists to modify the X3D Specification, this effort is moving forward by forming the X3D AR Working Group.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG include:&lt;br /&gt;
* Collect requirements and describe typical use cases for using X3D in AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for X3D specification required to support AR/MR applications&lt;br /&gt;
* Produce sample AR/MR applications using X3D to demonstrate how this functionality can work correctly&lt;br /&gt;
* Propose X3D Specification changes, possibly including an AR Component and/or a Mobile Profile&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
TODO these tasks should probably be rewritten as separate distinct efforts that support the Goals&lt;br /&gt;
&lt;br /&gt;
* Investigate state-of-the-art technologies related to X3D and AR/MR&lt;br /&gt;
* Develop and extend X3D specification to support AR/MR applications&lt;br /&gt;
* Promote X3D in AR/MR field by developing and demonstrating use cases&lt;br /&gt;
** videos are particularly compelling and more informative than a 3D demo because they show the use of AR/MR in the context of the real world&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR application - summer 2011&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - summer 2011&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification&lt;br /&gt;
** once use cases and requirements are stated, can compare existing and new proposals for X3D functionality&lt;br /&gt;
* Sample AR/MR applications with X3D&lt;br /&gt;
** these will be produced in support of each proposal&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time)  on 3rd Wednesday, which is 0910-1010 (Korea time) on 3rd Thursday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is __________ 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= Existing Proposals =&lt;br /&gt;
&lt;br /&gt;
== Instant Reality ==&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full node documentation can be found on [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, the IOSensor node for retrieving camera streams and the tracking results of the vision subsystem, the new PolygonBackground node for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, some papers on AR and MR visualization have already been published at the Web3D conferences. There, for example, occlusions, shadows and lighting in MR scenes were discussed in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially in section 6.4, of the following PhD thesis (by Y. Jung): [http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
Below, an example for achieving occlusion effects between real and virtual objects in AR/MR scenes is shown, given that the (real) 3D object for which occlusions should be handled already exists as a 3D model (given as a Shape in this example). Here, the invisible ghosting objects (denoting real geometry) are simply created by rendering them ''before'' the virtual objects (by setting the Appearance node's &amp;quot;sortKey&amp;quot; field to '-1') without writing any color values to the framebuffer (via the ColorMaskMode node), thereby initially stamping out the depth buffer.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;Shape&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance sortKey='-1'&amp;gt;&lt;br /&gt;
        &amp;lt;ColorMaskMode maskR='false' maskG='false' maskB='false' maskA='false'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
    ...&lt;br /&gt;
&amp;lt;/Shape&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To set the camera's image in the background we use the aforementioned PolygonBackground node. By setting its &amp;quot;fixedImageSize&amp;quot; field, the aspect ratio of the image can be defined. Depending on how you want the background image to fit into the window, set the mode field to 'VERTICAL' or 'HORIZONTAL'.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;PolygonBackground fixedImageSize='640,480' mode='VERTICAL'&amp;gt;&lt;br /&gt;
    &amp;lt;Appearance&amp;gt;&lt;br /&gt;
        &amp;lt;PixelTexture2D DEF='tex'/&amp;gt;&lt;br /&gt;
    &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
&amp;lt;/PolygonBackground&amp;gt; &lt;br /&gt;
&lt;br /&gt;
As mentioned above, more on that can be found in the corresponding tutorials, e.g. [http://doc.instantreality.org/tutorial/marker-tracking/ here].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Korean Chapter ==&lt;br /&gt;
&lt;br /&gt;
The Korea Chapter has been keenly interested in the standardization of augmented reality in many aspects, including AR-based content.  This is especially due to the recent world-wide appearance of mobile AR services and the realization (by both academia and industry) of the definite need for exchanging service content across different platforms. &lt;br /&gt;
&lt;br /&gt;
Three main proposals have been made within the Korean chapter, by: (1) Gerard Kim from Korea University (also representing KIST), (2) Gun A. Lee (formerly with ETRI, now with HITLabNZ), and (3) Woontack Woo of Gwangju Inst. of Science and Tech.   &lt;br /&gt;
&lt;br /&gt;
Here, we briefly describe each proposal and provide links to documents with more detailed descriptions.  These short summaries also try to highlight their distinctions with regard to the other proposals, not in a critical sense, but as a way to suggest alternatives.&lt;br /&gt;
&lt;br /&gt;
(1) Gerry Kim's proposal can be highlighted by the following features:&lt;br /&gt;
&lt;br /&gt;
- Extension of existing X3D &amp;quot;sensors&amp;quot; and formalisms to represent physical objects serving as proxies for virtual objects&lt;br /&gt;
&lt;br /&gt;
- The physical objects and virtual objects are tied using the &amp;quot;routes&amp;quot; (e.g. virtual objects' parent coordinate system being set to that of the corresponding physical object).&lt;br /&gt;
&lt;br /&gt;
- Below is an example construct, a simple extension of the &amp;quot;VisibilitySensor&amp;quot; attached to a marker.  The rough semantics would be to attach a sphere to a marker when the marker is visible.  The visibility would be determined by the browser using a particular tracker.  In this simple case, a simple marker description is given through the &amp;quot;Marker&amp;quot; node.&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;Scene&amp;gt;&lt;br /&gt;
  &amp;lt;Group&amp;gt;&lt;br /&gt;
   &amp;lt;Marker DEF='HIRO' enabled='true' filename='C:\hiro.patt'/&amp;gt;&lt;br /&gt;
   &amp;lt;VisibilitySensor DEF='Visibility' description='activate if seen' enabled='true'/&amp;gt;&lt;br /&gt;
   &amp;lt;Transform DEF='BALL'&amp;gt;&lt;br /&gt;
    &amp;lt;Shape&amp;gt;&lt;br /&gt;
     &amp;lt;Appearance&amp;gt;&lt;br /&gt;
      &amp;lt;Material/&amp;gt;&lt;br /&gt;
     &amp;lt;/Appearance&amp;gt;&lt;br /&gt;
     &amp;lt;Sphere/&amp;gt;&lt;br /&gt;
    &amp;lt;/Shape&amp;gt;&lt;br /&gt;
   &amp;lt;/Transform&amp;gt;&lt;br /&gt;
  &amp;lt;/Group&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/&amp;gt;&lt;br /&gt;
 &amp;lt;/Scene&amp;gt;&lt;br /&gt;
&lt;br /&gt;
- Different types of sensors can be newly defined, or old ones extended, to describe various AR content.  These include proximity sensors, range sensors, etc.&lt;br /&gt;
&lt;br /&gt;
- Different physical object descriptions will be needed at the right level of abstraction (such as the &amp;quot;Marker&amp;quot; node in the above example).  These include descriptions for image patches, 3D objects, GPS locations, natural features (e.g. points, lines), etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(2) Gun Lee's proposal&lt;br /&gt;
&lt;br /&gt;
- Extension of the TextureBackground and MovieTexture nodes to handle video backgrounds for video see-through AR implementations.&lt;br /&gt;
&lt;br /&gt;
- Introduction of a node called &amp;quot;LiveCam&amp;quot; representing the video capture or vision based sensing in a video see-through AR implementation.&lt;br /&gt;
&lt;br /&gt;
- The video background would be routed from the &amp;quot;LiveCam&amp;quot; node and be supplied with the video image and/or camera parameters.&lt;br /&gt;
&lt;br /&gt;
- Extension of the virtual viewpoint to accommodate more detailed camera parameters, to be set according to the parameters of the &amp;quot;LiveCam&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
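A minimal sketch of how such a &amp;quot;LiveCam&amp;quot; node might be wired to the video background; note that the field names used here (&amp;quot;image&amp;quot;, &amp;quot;texture&amp;quot;) are illustrative assumptions only and are not part of the proposal text:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;LiveCam DEF='CAM'/&amp;gt;&lt;br /&gt;
  &amp;lt;TextureBackground DEF='BG'/&amp;gt;&lt;br /&gt;
  &amp;lt;ROUTE fromNode='CAM' fromField='image' toNode='BG' toField='texture'/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The camera parameters would be routed analogously from the &amp;quot;LiveCam&amp;quot; to the extended viewpoint node.&lt;br /&gt;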
&lt;br /&gt;
(3) Woo's proposal&lt;br /&gt;
&lt;br /&gt;
- Woo proposes to use XML, as meta descriptors, together with existing standards (e.g. X3D, Collada, etc.), for describing the augmentation information itself.&lt;br /&gt;
&lt;br /&gt;
- As for the context (condition) for augmentation, a clear specification of a &amp;quot;5W&amp;quot; approach is proposed: namely who, when, where, what and how.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;who&amp;quot; part specifies the owner/author of the contents.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;when&amp;quot; part specifies content creation time.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;where&amp;quot; part specifies the location of the physical object to which an augmentation is attached.&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;what&amp;quot; part specifies the content to be augmented (the augmentation information).&lt;br /&gt;
&lt;br /&gt;
- &amp;quot;how&amp;quot; part specifies the dynamic part (behavior) of the content. &lt;br /&gt;
&lt;br /&gt;
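Purely as an illustration, such an XML meta descriptor following the 5W approach might look as follows; all element names here are hypothetical and not taken from the proposal itself:&lt;br /&gt;
&lt;br /&gt;
  &amp;lt;Augmentation&amp;gt;&lt;br /&gt;
   &amp;lt;Who&amp;gt;content author&amp;lt;/Who&amp;gt;&lt;br /&gt;
   &amp;lt;When&amp;gt;2011-03-21&amp;lt;/When&amp;gt;&lt;br /&gt;
   &amp;lt;Where&amp;gt;location of the physical object&amp;lt;/Where&amp;gt;&lt;br /&gt;
   &amp;lt;What&amp;gt;link to X3D or Collada content&amp;lt;/What&amp;gt;&lt;br /&gt;
   &amp;lt;How&amp;gt;behavior description&amp;lt;/How&amp;gt;&lt;br /&gt;
  &amp;lt;/Augmentation&amp;gt;&lt;br /&gt;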
* [http://dxp.korea.ac.kr/AR_standards/AR_standards.zip A zip archive containing the various Korean proposals].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== X3D Earth Working Group ==&lt;br /&gt;
&lt;br /&gt;
The X3D Earth Working Group has expanded and refined a proposal by Dr. Myeong Won Lee for a new &lt;br /&gt;
[http://www.web3d.org/membership/login/memberwiki/index.php/X3D_v3.3_Specification_Changes#GpsSensor_node GpsSensor node].&lt;br /&gt;
Due to several overlapping technical issues, the group has asked to collaborate with the Augmented Reality group on final design for this node.&lt;br /&gt;
&lt;br /&gt;
= Participants =&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
= References =&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
= Participation and Liaisons =&lt;br /&gt;
&lt;br /&gt;
* TODO describe Christine Perry's group on AR Standardization&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
Of interest is that Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty free (RF) specifications.  These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3257</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3257"/>
				<updated>2011-03-21T10:16:01Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5 and this has good promise for AR applications. &lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely within the W3C HTML5 WG to align our standards for 3D visualization on the Web. &lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction can be considered from the Community. Feedback from this community will help X3D quickly and stably adopt new technologies to provide ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile Web3D Members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available. There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group; its charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of X3D for AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for the X3D specification as required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually held as follows:&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR-only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
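&lt;br /&gt;
As a rough illustrative sketch only (the node names IOSensor and PolygonBackground come from the tutorials above, but the field names, type value and route shown here are assumptions for illustration; consult the Tracking-Tutorial for the exact Instant Reality syntax), such a setup might be wired up like this:&lt;br /&gt;
&lt;pre&gt;
&amp;lt;!-- hypothetical vision subsystem configuration; field names are illustrative --&amp;gt;
&amp;lt;IOSensor DEF='vision' type='VisionLib'&amp;gt;
  &amp;lt;field accessType='outputOnly' name='CameraImage' type='SFImage'/&amp;gt;
&amp;lt;/IOSensor&amp;gt;

&amp;lt;!-- displays the live camera image behind all virtual objects --&amp;gt;
&amp;lt;PolygonBackground DEF='background'/&amp;gt;

&amp;lt;!-- hypothetical route feeding the camera stream into the background --&amp;gt;
&amp;lt;ROUTE fromNode='vision' fromField='CameraImage' toNode='background' toField='set_image'/&amp;gt;
&lt;/pre&gt;&lt;br /&gt;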
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences; these discuss, for example, occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter 6 (p. 163 ff.), especially section 6.4, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications. These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3256</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3256"/>
				<updated>2011-03-21T10:14:47Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications.&lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web.&lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comment and reaction from the community can be considered. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile, Web3D Members retain the important membership right of proposing significant new technology and of considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available. There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group; its charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of X3D for AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for the X3D specification as required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually held as follows:&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR-only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences; these discuss, for example, occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
Moreover, some further ideas on realistic MR rendering in X3D are outlined in chapter X, especially section y, of the following PhD thesis:&lt;br /&gt;
[http://tuprints.ulb.tu-darmstadt.de/2489/ PDF].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications. These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3255</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3255"/>
				<updated>2011-03-21T09:53:13Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications.&lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web.&lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comment and reaction from the community can be considered. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile, Web3D Members retain the important membership right of proposing significant new technology and of considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available. There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group; its charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of X3D for AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for the X3D specification as required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually held as follows:&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR-only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
Several tutorials on vision-based tracking with Instant Reality also exist. They describe, for example, specific nodes such as the IOSensor for retrieving camera streams and the tracking results of the vision subsystem, the PolygonBackground for displaying the camera images behind the virtual objects, and some useful camera extensions to the X3D Viewpoint node: [http://doc.instantreality.org/tutorial/ Tracking-Tutorial].&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences; these discuss, for example, occlusions, shadows and lighting in MR scenes in the context of X3D:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications. These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3254</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3254"/>
				<updated>2011-03-21T09:44:57Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
Web3D Consortium has a special interest in the recent AR initiatives and creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly [http://www.x3dom.org X3DOM] open source produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD]. &lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out of the box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications.&lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web.&lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://opengeospacial.org  OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible range of comment and reaction from the community can be considered. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile, Web3D Members retain the important membership right of proposing significant new technology and of considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
[http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf Slideset summary]&lt;br /&gt;
from last summer's Mobile X3D ISO Workshop has also been&lt;br /&gt;
linked, describing how Mobile, HTML5 and possibly Augmented Reality (AR)&lt;br /&gt;
components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available. There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membersip/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be the subject of the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing/extending X3D functions to support augmented and mixed reality applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group; its charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of X3D for AR/MR applications&lt;br /&gt;
* Develop and extend functions and nodes for the X3D specification as required to support AR/MR&lt;br /&gt;
* Produce sample AR/MR applications with X3D&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference for X3D and Augmented Reality is usually held as follows:&lt;br /&gt;
* Together with Korea Chapter Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 1st Wednesday, which is 0910-1010 (Korea time) on 1st Thursday.&lt;br /&gt;
* AR-only Meeting:&lt;br /&gt;
1710-1810 (Pacific time)/2010-2110 (Eastern time) on 3rd Tuesday, which is 0910-1010 (Korea time) on 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If the email traffic becomes very busy then we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
Instant Reality is a Mixed Reality framework developed and maintained by Fraunhofer IGD. It uses X3D as its application description language and thus also provides corresponding nodes and concepts for developing AR/MR applications. The full documentation is available at [http://doc.instantreality.org/documentation/ IR-Docs].&lt;br /&gt;
&lt;br /&gt;
In addition, several papers on AR and MR rendering have already been published at the Web3D conferences:&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/6/63/07X3DforMR.pdf MR-Paper-2007]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0d/07X3DforMR_slides.pdf MR-Paper-Slides]&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/images/0/0a/08PRTforX3D.pdf MR-Paper-2008]&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
* Sabine Webel&lt;br /&gt;
* Yvonne Jung&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of note, the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications. These coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=File:08PRTforX3D.pdf&amp;diff=3253</id>
		<title>File:08PRTforX3D.pdf</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=File:08PRTforX3D.pdf&amp;diff=3253"/>
				<updated>2011-03-21T09:38:44Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: Web3D 2009 Paper on PRT and MR&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Web3D 2009 Paper on PRT and MR&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=File:07X3DforMR_slides.pdf&amp;diff=3252</id>
		<title>File:07X3DforMR slides.pdf</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=File:07X3DforMR_slides.pdf&amp;diff=3252"/>
				<updated>2011-03-21T09:37:44Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: Slides for Web3D 2007 MR Paper&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Slides for Web3D 2007 MR Paper&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=File:07X3DforMR.pdf&amp;diff=3251</id>
		<title>File:07X3DforMR.pdf</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=File:07X3DforMR.pdf&amp;diff=3251"/>
				<updated>2011-03-21T09:36:41Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: Web3D 2007 Paper on Mixed Reality&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Web3D 2007 Paper on Mixed Reality&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	<entry>
		<id>https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3250</id>
		<title>X3D and Augmented Reality</title>
		<link rel="alternate" type="text/html" href="https://old.web3d.org/wiki/index.php?title=X3D_and_Augmented_Reality&amp;diff=3250"/>
				<updated>2011-03-21T09:24:25Z</updated>
		
		<summary type="html">&lt;p&gt;Yjung: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== History ==&lt;br /&gt;
The Web3D Consortium has a special interest in recent AR initiatives and in creating the AR Technology Road Map.&lt;br /&gt;
&lt;br /&gt;
We have several member projects showcasing the feasibility of AR in X3D, particularly the open-source [http://www.x3dom.org X3DOM] framework produced by [http://www.igd.fhg.de/www/igd-a4 Fraunhofer IGD].&lt;br /&gt;
&lt;br /&gt;
* X3DOM can serve as an out-of-the-box, standards-based solution for AR developers.&lt;br /&gt;
&lt;br /&gt;
* X3D and X3DOM continue to improve scripting options for X3D with HTML5, which holds good promise for AR applications.&lt;br /&gt;
&lt;br /&gt;
* X3DOM is being considered for potential standardization in a Mobile and/or Augmented Reality (AR) profile for X3D.&lt;br /&gt;
&lt;br /&gt;
Our [http://www.web3d.kr Web3D Korea Chapter members] from ETRI are working on Mixed Reality visualization in X3D.&lt;br /&gt;
&lt;br /&gt;
* The Consortium has been working closely with the W3C HTML5 WG to align our standards for 3D visualization on the Web.&lt;br /&gt;
&lt;br /&gt;
Additional details are available at: &lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML5 X3D and HTML5]&lt;br /&gt;
&lt;br /&gt;
* [http://www.web3d.org/x3d/wiki/index.php/X3D_and_HTML_FAQ X3D and HTML FAQ]&lt;br /&gt;
&lt;br /&gt;
* [http://www.x3dom.org X3DOM]&lt;br /&gt;
&lt;br /&gt;
The Web3D Consortium is also engaged with the [http://www.arstandards.org AR standards], [http://www.opengeospatial.org OGC], [http://www.khronos.org Khronos] and [http://w3c.org W3C] organizations for applying and adapting X3D.&lt;br /&gt;
&lt;br /&gt;
* Web3D Consortium will be starting an AR working group in March 2011 to develop a roadmap for AR standardization. We encourage your participation. &lt;br /&gt;
&lt;br /&gt;
* The technology discussions and meetings will be conducted publicly so that the widest possible comment and reaction from the community can be considered. Feedback from this community will help X3D adopt new technologies quickly and stably, providing ongoing archival value for all 3D graphics.&lt;br /&gt;
&lt;br /&gt;
* Meanwhile, Web3D members still retain important membership rights of proposing significant new technology and considering patented technologies within a &amp;quot;safe haven&amp;quot; prior to public release.&lt;br /&gt;
&lt;br /&gt;
A [http://www.web3d.org/x3d/wiki/images/3/32/X3dProfilePossibilitiesMobileHtml5AR.2010June29.pdf slideset summary] from last summer's Mobile X3D ISO Workshop has also been linked, describing how Mobile, HTML5 and possibly Augmented Reality (AR) components might be aligned together as an X3D Mobile Profile.&lt;br /&gt;
&lt;br /&gt;
Many new Web3D capabilities are becoming available.  There has never been a better time to discuss X3D technologies and to join the [http://web3d.org/membership/join Web3D Consortium]. Get involved in these early discussions to create the AR Technology Road Map. We look forward to your participation.&lt;br /&gt;
&lt;br /&gt;
= Draft Charter =&lt;br /&gt;
&lt;br /&gt;
Adding points to this section will be a topic for the next meetings.&lt;br /&gt;
&lt;br /&gt;
== Overview ==&lt;br /&gt;
The Augmented Reality Working/Special Interest Group focuses on utilizing and extending X3D functionality to support augmented and mixed reality (AR/MR) applications.&lt;br /&gt;
&lt;br /&gt;
== Discussions ==&lt;br /&gt;
These X3D AR discussions are part of a special interest group. The effort is moving forward by forming the X3D AR Working Group, whose charter, roles and responsibilities are being defined.&lt;br /&gt;
&lt;br /&gt;
== Goals ==&lt;br /&gt;
Planned goals of the AR WG/SIG include:&lt;br /&gt;
* Collect requirements and describe typical use cases of using X3D for AR/MR applications.&lt;br /&gt;
* Develop and extend functions and nodes in the X3D specification required to support AR/MR.&lt;br /&gt;
* Produce sample AR/MR applications with X3D.&lt;br /&gt;
&lt;br /&gt;
== Tasks ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Deliverables and Timeline ==&lt;br /&gt;
* Use cases of X3D for AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Requirements for X3D to support AR/MR applications - May 2011 (TBD)&lt;br /&gt;
* Proposed new/extended functions and nodes for the X3D specification - TBD&lt;br /&gt;
* Sample AR/MR applications with X3D - TBD&lt;br /&gt;
&lt;br /&gt;
== Meetings ==&lt;br /&gt;
&lt;br /&gt;
Our twice-monthly teleconference on X3D and Augmented Reality is usually scheduled as follows:&lt;br /&gt;
* Together with the Korea Chapter meeting:&lt;br /&gt;
1710-1810 (Pacific time) / 2010-2110 (Eastern time) on the 1st Wednesday, which is 0910-1010 (Korea time) on the 1st Thursday.&lt;br /&gt;
* AR-only meeting:&lt;br /&gt;
1710-1810 (Pacific time) / 2010-2110 (Eastern time) on the 3rd Tuesday, which is 0910-1010 (Korea time) on the 3rd Wednesday.&lt;br /&gt;
&lt;br /&gt;
Participation is open to everyone via the Web3D teleconference line.  Non-members can [mailto:anita.havele@web3d.org?subject=AR%20teleconference%20request request access information] for this call, or [http://web3d.org/membership/join Join Web3D]!&lt;br /&gt;
&lt;br /&gt;
Our next public teleconference is 15/16 March 2011.&lt;br /&gt;
&lt;br /&gt;
Meeting agenda:&lt;br /&gt;
TBA through the mailing list.&lt;br /&gt;
&lt;br /&gt;
Discussions occur on the [mailto:x3d-public@web3d.org x3d-public@web3d.org mailing list].  If email traffic becomes very busy, we can create a separate email list.&lt;br /&gt;
&lt;br /&gt;
Meeting minutes are also distributed on the X3D-Public mailing list and [http://web3d.org/pipermail/x3d-public_web3d.org/ archived online].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Existing Proposals ==&lt;br /&gt;
&lt;br /&gt;
=== Instant Reality ===&lt;br /&gt;
Full documentation is available on [http://doc.instantreality.org/documentation/ IR-Docs]&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
=== Korean Chapter ===&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Anita Havele&lt;br /&gt;
* Damon Hernandez&lt;br /&gt;
* Don Brutzman&lt;br /&gt;
* Gerard J. Kim&lt;br /&gt;
* Gun Lee&lt;br /&gt;
* Len Daly, Daly Realism&lt;br /&gt;
* Timo Engelke&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&lt;br /&gt;
== Participation and Liaisons ==&lt;br /&gt;
&lt;br /&gt;
* Other partnerships can also be considered as appropriate.&lt;br /&gt;
&lt;br /&gt;
* Of interest is that the Web3D Consortium Intellectual Property Rights (IPR) Policy insists on open, royalty-free (RF) specifications, which coexist effectively with the Web Architecture and many different business models.&lt;/div&gt;</summary>
		<author><name>Yjung</name></author>	</entry>

	</feed>