Is your feature request related to a problem? Please describe.
The current implementation of OGC 3D Tiles point cloud rendering relies on the three.js PointsMaterial, whose sizeAttenuation parameter specifies whether point size is attenuated by camera depth. It effectively acts as a switch determining whether material.pointSize is expressed in screen pixel units or in scene world units (see 3DTilesRendererJS/src/plugins/three/DebugTilesPlugin.js, lines 443 to 445 in 8142a2d).
Describe the solution you'd like
For better rendering of multi-resolution point cloud tilesets, it would be helpful to make the point-size attenuation depend on the geometricError or on the tile's level within the hierarchy, the same way it is currently implemented in CesiumJS (+ plugins) and Potree: point attenuation based on geometric error (see this useful Sandcastle demo), or attenuation = 2 ** lod_level.
Describe alternatives you've considered
This would probably involve developing a custom point cloud shader that handles point sizing and attenuation per vertex, given the LOD level passed as a uniform.
Additional context
Note that another useful feature of the Potree renderer is HQ splat rendering, which resembles antialiasing although it is achieved differently - see HQSplatRenderer; the main related parts of the vertex shader and fragment shader are highlighted.
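As a rough illustration of the LOD-based sizing being requested, the quoted `attenuation = 2 ** lod_level` idea could be sketched as a plain function. The clamping bounds and parameter names here are illustrative assumptions, not an existing API:

```javascript
// Hypothetical helper: derive a point size from the tile's LOD level,
// following the "attenuation = 2 ** lod_level" formula referenced above.
// basePixelSize, minSize and maxSize are illustrative parameters.
function attenuatedPointSize( basePixelSize, lodLevel, minSize = 1, maxSize = 16 ) {

	// Scale the base size by a factor derived from the tile's level in the
	// hierarchy, then clamp so extreme levels stay within a usable range.
	const attenuation = 2 ** lodLevel;
	return Math.min( maxSize, Math.max( minSize, basePixelSize * attenuation ) );

}

console.log( attenuatedPointSize( 1, 0 ) ); // root tile → 1
console.log( attenuatedPointSize( 1, 3 ) ); // deeper tile → 8
```

In a custom shader this same computation would run per vertex against a per-tile uniform, feeding the result into gl_PointSize.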
Do you have an example of how, specifically, you'd like the points to behave? The Cesium example uses screen-space sizing (PointsMaterial.sizeAttenuation === false) while the Potree example(s) use world-space sizing (sizeAttenuation === true). Right now the behavior is left to the default three.js behavior when loading a PNTS or GLTF file with points.
In either case there should be no need for a new shader to scale based on LoD or geometric error: the points material "size" value can be adjusted by a factor derived from either tile value. Regarding "HQ splats" from Potree - from this issue it seems the points are simply rendered with the depth of a sphere so that overlapping points intersect, though this would require shader modifications.
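For context, the sphere-depth idea behind those HQ splats can be sketched as plain math. This is a hypothetical helper mirroring what a point-sprite fragment shader would compute per fragment; the names and normalization are assumptions for illustration:

```javascript
// Illustrative sphere-depth math behind Potree-style "HQ splats": each point
// sprite is shaded as if it were a small sphere, so overlapping points
// intersect in depth rather than one quad popping in front of the other.
// (u, v) is the sprite coordinate in [0, 1]^2, like gl_PointCoord in GLSL.
function sphereDepthOffset( u, v ) {

	// Remap to [-1, 1]^2, centered on the sprite.
	const cx = u * 2 - 1;
	const cy = v * 2 - 1;
	const r2 = cx * cx + cy * cy;

	// Outside the sphere's silhouette the fragment would be discarded.
	if ( r2 > 1 ) return null;

	// Normalized depth bump toward the camera: 1 at the center, 0 at the rim.
	return Math.sqrt( 1 - r2 );

}
```

In a real fragment shader this offset would be subtracted from the point's depth (via gl_FragDepth or an equivalent) so that adjacent splats blend into a continuous surface.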
Either way, everything you're suggesting can already be implemented using a plugin or by modifying the tile content geometry via the load-model event.
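A minimal sketch of that plugin-free approach, assuming a helper that derives a size factor from the tile's geometricError. The log2 mapping and the factor bounds are illustrative choices, not something the library prescribes:

```javascript
// Hypothetical mapping from a tile's geometricError to a point-size factor.
// Coarser tiles (larger error) get larger points; the log2 curve keeps the
// growth gentle and the Math.max floor prevents shrinking fine tiles.
function pointSizeFactor( geometricError ) {

	return Math.max( 1, Math.log2( 1 + geometricError ) );

}

// Wiring it up with the load-model event mentioned above would look
// roughly like this (sketch only, assuming the event provides the loaded
// scene and its tile):
//
//   tiles.addEventListener( 'load-model', ( { scene, tile } ) => {
//
//       scene.traverse( obj => {
//
//           if ( obj.isPoints ) obj.material.size *= pointSizeFactor( tile.geometricError );
//
//       } );
//
//   } );
```

The multiplied "size" is then interpreted in screen pixels or world units depending on the material's sizeAttenuation flag, so the same factor works for either sizing mode.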