
merge or differentiate ProfilingSonar and ImagingSonar #2

Open
smaria opened this issue Nov 16, 2018 · 8 comments

@smaria
Collaborator

smaria commented Nov 16, 2018

I struggle to identify the difference between the two sonar messages. Both seem to be based on the laser scan messages from sensor_msgs.

Extra fields in ProfilingSonar:

  • float32 range_resolution # [% of range error]
  • float32 time_increment # For a mechanically scanning sonar (...)
  • float32[] ranges # range data [m]
  • float32[] intensities # intensity data [device-specific units]

Extra fields in Imaging Sonar:

  • float32 elevation_angle # [Hz]
  • float32 range_increment # maximum range value [m]
  • SonarBeam[] beams # Imaging sonar data.

It appears the two are slightly different implementations of the same concept:
range_resolution <-> range_increment
intensities <-> beams

With my current knowledge of sonar sensing I cannot clearly assign a sonar device to either of these messages, and I know of no argument for why both are needed. As a merge of the two I would suggest https://github.com/smaria/marine_ros_conventions_discussion/blob/master/sensor_msgs/SonarScan.msg

@narcispr
Contributor

Profiling sonars measure range. They can be multibeam (several ranges at a time) or mechanical scanning sonars (i.e., they mechanically rotate and provide a range at each angle). Imaging sonars provide an image: for each beam they return hundreds of measurements.
I think they are different enough to keep them separated. However, I'm not sure about the fields that they have to contain.

@tfurf

tfurf commented Dec 19, 2018

@narcispr, yeah, this can be difficult because some sonars even provide phase information (e.g. each 'cell' is a complex number).

I think writing a message that is generic "enough" in ROS is always going to be hard, given how the ROS msg IDL is limited to very simple patterns (only composition, no sense of inheritance). Limiting the options to ranging vs. imaging systems is a pragmatic approach. (NB that in some ways the profiling sonar is a down-sampling of the imaging sonar --- e.g. a MB return looks like an image but can be processed to be (range, intensity) tuples vs beam angle)
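The down-sampling relationship @tfurf mentions can be made concrete with a small sketch: given an imaging-style return (one intensity per range bin, per beam), pick the strongest bin in each beam to recover profiling-style (angle, range, intensity) tuples. Function and parameter names here are illustrative, not from either proposed message.

```python
import numpy as np

def imaging_to_profiling(beam_angles, intensities, range_resolution, threshold=0.0):
    """Collapse an imaging-sonar return into profiling-style tuples.

    beam_angles      : (n_beams,) beam steering angles [rad]
    intensities      : (n_beams, n_bins) per-bin echo intensities
    range_resolution : size of one range bin [m]
    """
    intensities = np.asarray(intensities, dtype=float)
    peak_bins = intensities.argmax(axis=1)                 # strongest bin per beam
    peak_vals = intensities[np.arange(len(peak_bins)), peak_bins]
    ranges = peak_bins * range_resolution                  # bin index -> metres
    # keep only beams whose peak exceeds a (device-specific) threshold
    return [(a, r, v)
            for a, r, v in zip(beam_angles, ranges, peak_vals)
            if v > threshold]
```

This is only the simplest possible detector (argmax per beam); survey-grade bottom detection is considerably more involved, which is part of the argument for keeping the raw per-beam intensities in the message.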

@smaria
Collaborator Author

smaria commented Dec 19, 2018

The suggested message type with unified field names can be split back into two, by removing line 32 for the profiling sonar. I believe all other information remains valuable for both cases.

However, I don't see an advantage in the split. Whilst more complex sonar data exists, in my experience single- or multi-beam intensity data [0] is common enough to warrant being considered in the generic message.

My concern with splitting this into two message types is that packages using only the profiling data would likely also subscribe only to the profiling message. A typical sensor driver for an intensity-capable sonar would then have to publish both, leading to a lot of duplicated information.
A beginner would not only have to identify which of the two message types to use, but would also need to realize that both may be necessary to access the range data of an intensity-capable sonar. The unified message would also stay true to its role model, LaserScan, which likewise features an intensities field that can be left intentionally empty.

[0] What I call multi beam intensity data seems to be what @narcispr refers to as imaging. As far as I can tell, his distinction does not seem to consider single beam sonars (scanning or fixed) that can provide both ranges and intensities, or devices that would be called 'imaging sonars' but also provide a device internal range detection.

@ivaughn

ivaughn commented Jan 9, 2019

How much detail are we looking to get into? Some multibeams (Kongsberg EM-series, Reson-7125) make it possible to get per-beam TX time, RX time, and steering angles. This can be especially important with deep water multibeam from the surface as the multibeam can roll/pitch quite a lot in the two seconds it takes to make the round-trip to 1500m. Multi-sector systems also have a time offset between transmitting different beams.

Also, virtually all of these higher-end multibeams support at least equi-distant or equi-angular beam spacing.

Does the broader community have a sense of how important it is for the standard multibeam message to support raytracing a new sound velocity profile?
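The scale of the attitude problem @ivaughn raises is easy to sanity-check with back-of-the-envelope arithmetic: a straight-ray, single-sound-speed approximation (exactly the simplification survey-grade processing avoids) already shows why per-beam TX/RX times matter. The numbers below are illustrative.

```python
def two_way_travel_time(depth_m, sound_speed_mps=1500.0):
    # Straight-ray approximation: down and back at one nominal sound speed.
    # Real IHO-grade processing raytraces through a sound velocity profile.
    return 2.0 * depth_m / sound_speed_mps

def attitude_change_deg(depth_m, roll_rate_dps, sound_speed_mps=1500.0):
    # How far the vessel can roll between transmit and receive.
    return roll_rate_dps * two_way_travel_time(depth_m, sound_speed_mps)
```

At 1500 m depth the round trip takes about two seconds, so even a modest 5 deg/s roll rate means roughly 10 degrees of attitude change between TX and RX, which is why a single per-ping orientation is not enough for deep-water surface multibeam.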

@smaria
Collaborator Author

smaria commented Jan 21, 2019

It sounds like the SonarBeam msg could be enhanced with more optional fields, giving the details for individual beams that @ivaughn suggested, which can be left empty if unused (did I miss any?):

time rx_time
time tx_time
float32 beam_angle  # angle of the sonar beam relative to the zero degree position of the sensor frame

A further suggestion for a field in the main message is range_intensities. This is based on the imagenex deltaT which can return either ranges, or ranges + intensity at that range, or no ranges but the full set of intensities for each beam.
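A consumer of such a message would need to tell the three DeltaT-style payload combinations apart. A minimal sketch, assuming optional fields are simply left empty when unused (the field names are illustrative, not a settled message definition):

```python
def classify_return(ranges, intensities, range_intensities):
    """Distinguish the three output modes described above:
    ranges only, ranges + intensity at the detected range,
    or the full intensity profile for each beam."""
    if ranges and range_intensities:
        return "ranges + intensity at detected range"
    if ranges:
        return "ranges only"
    if intensities:
        return "full intensity profile per beam"
    return "empty"
```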

With all the extra optional fields, the message becomes a bit overwhelming for new users; however, I am unsure where to draw the line for a simpler message. A possible split would be removing the fields

  • float32 angle_min
  • float32 angle_max
  • float32 angle_increment

and replacing SonarBeam[] intensities with a single float32[] intensities array. The resulting message would then apply to single-beam and mechanically scanning ranging and beam-intensity sonars. This assumes the mechanical scanning sonar data is sent per beam, not as an image composed of multiple beams.
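For discussion, the simplified single-beam variant could be mocked up like this (a Python sketch of the field set described above, not an agreed .msg specification; names follow the thread's proposals):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SimpleSonarScan:
    """Single-beam / mechanically scanning variant: the angle_* fields
    are dropped and SonarBeam[] collapses to one intensities array."""
    range_min: float = 0.0          # minimum range value [m]
    range_max: float = 0.0          # maximum range value [m]
    range_resolution: float = 0.0   # [m]
    ranges: List[float] = field(default_factory=list)       # range data [m]
    intensities: List[float] = field(default_factory=list)  # device-specific units;
                                                            # empty if unsupported
```

As with LaserScan, a ranging-only device would simply publish with intensities left empty.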

Rather than differentiating ranging and intensity data, this approach differentiates single and multiple beams. This principle could also be further extended by a third message type for sonars with beam positions rotated around two axis (though I don't know enough about those to know if this would indeed work).

Any other split suggestions, arguments for/against splitting, or extra field suggestions?

@ivaughn

ivaughn commented Jan 22, 2019

Can I ask what our goal is for multibeam data? In particular, what's the role for a sonar message vs. a more general ROS pointcloud message?

As a rule, multibeam data is stored in one of three formats:

1). Beamformed RX and TX angles + travel time (usually two-way). This is generally not converted to a range, because in deep water, or when trying to meet International Hydrographic Organization standards, the conversion involves solving a PDE rather than simply dividing by sound velocity. This format is of interest here for two reasons: first, it's what survey-grade multibeams spit out, and second, there are some folks trying to do IHO-style processing. I just can't get them to chime in (C'mon Val!). Reson's s7k datagrams don't actually include a range.

2). Processed point data is typically stored in a pointcloud, admittedly in a pretty confusing coordinate frame (usually local-level but with heading+lat/lon defined in the body frame). Without a compelling reason to do otherwise, I'd suggest we use pointclouds for multibeam data. With a sensible choice of instrument coordinate frame, this becomes a decent way to handle low-end planar multibeams.

Note that we have to use full XYZ pointclouds because the points from yaw-compensating multibeams are not coplanar.

For details on how this has traditionally been handled, check out Leidos' Generic Sensor Format (GSF -- see http://www3.mbari.org/products/mbsystem/formatdoc/gsf_lib.pdf) and MB-System's native FBT format (http://www3.mbari.org/products/mbsystem/batheditintegration/index.html).

Conventional multibeam processing almost always uses indexed pointclouds. If your multibeam has 256 beams, every datagram should have 256 beams. Users, filtering algorithms, and the sonar itself can then flag beams as invalid.

3). Gridded data. This should ALSO be a thing, but will vary a fair bit based on whether people are using octomap, conventional grids, etc etc. Another topic for another issue.
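The indexed-pointcloud convention in (2) is worth sketching, since it differs from the variable-length arrays in the proposed sonar messages: every ping carries the full fixed beam count, and invalid beams are flagged in place rather than dropped. A minimal illustration (NaN coordinates as the invalid flag; names hypothetical):

```python
import math

N_BEAMS = 256  # every datagram carries all beams, valid or not

def make_ping(soundings):
    """soundings: dict mapping beam index -> (x, y, z).
    Missing beams stay in place, flagged invalid with NaN coordinates,
    so beam indices remain aligned across pings and beams can be
    flagged/unflagged later by filtering algorithms or users."""
    ping = [(math.nan, math.nan, math.nan)] * N_BEAMS
    for i, xyz in soundings.items():
        ping[i] = xyz
    return ping

def valid_beams(ping):
    return [i for i, (x, _, _) in enumerate(ping) if not math.isnan(x)]
```

A real implementation would more likely use a separate per-beam flag field (as GSF does) so that the original sounding survives being flagged, but the fixed-size, index-aligned layout is the key point.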

@smaria
Collaborator Author

smaria commented Jan 23, 2019

Interesting! The proposed message is based on the existing suggestions (which in turn both seem to be based on LaserScan) and my (limited) experience using multibeam data for obstacle avoidance and mapping. It sounds like we need more message options for multibeam data, depending on how the data is further processed and used.

It seems to me that the proposed message is useful, but cannot cover all use cases. I would suggest opening a new issue to discuss further multibeam sonar messages (#9), and refocusing this issue on the suggested message: how/if it should be split and whether it is missing any information. When to use existing message types and when to define our own is a general question that we will likely run into repeatedly. I added some general notes to #4.

@pvazteixeira

(apologies if this is not the right issue/place for this comment)

@ivaughn IMO the multibeam sonar message should be closer to what you're describing in (1) - data containing the intensity over time/range for each beam, as this would not prevent the user from running whatever processing is needed on the raw data. Additionally, a ROS driver for the multibeam could publish a separate range/point cloud message in parallel (something similar to LaserScan).

Regarding your point on reference frames - and maybe overlapping with #7 - I would go in the same direction as @smaria's proposed message specification, where a multibeam message contains an array of single-beam Sonar messages, each carrying its own header (and reference frame). I would also suggest changing SonarBeam to be more "self-contained", so that it can be used with single-beam sonars (e.g. depth sounders or mechanically scanned profiling sonars), as well as accommodating beams that are not coplanar and/or equally spaced.

# single beam sonar measurement
Header header

float32 azbw             # beamwidth in azimuth/y [rad]
float32 elbw             # beamwidth in elevation/z [rad]
float32 frequency        # [Hz]

float32 range_min        # minimum range value [m]
float32 range_max        # maximum range value [m]

float32[] range          # range data [m] (Note: values < range_min or > range_max should be discarded)
float32[] intensity      # intensity data [device-specific units].  If your
                         # device does not provide intensities, please leave
                         # the array empty.

Another advantage of moving some of the complexity to this single-beam message would be that you would potentially no longer need separate messages for profiling and imaging multibeams.
In the past, I have used a similar LCM specification with both imaging and profiling multibeams.
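To check that the self-contained single-beam message composes as intended, here is a Python mock-up of the specification above plus a wrapping multibeam message (the SonarBeam/MultibeamScan composition is my assumption of how the array-of-beams message would look; field names follow the proposed .msg):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SonarBeam:
    """Mock-up of the proposed single-beam message (header omitted)."""
    azbw: float = 0.0        # beamwidth in azimuth/y [rad]
    elbw: float = 0.0        # beamwidth in elevation/z [rad]
    frequency: float = 0.0   # [Hz]
    range_min: float = 0.0   # minimum range value [m]
    range_max: float = 0.0   # maximum range value [m]
    range: List[float] = field(default_factory=list)      # range data [m]
    intensity: List[float] = field(default_factory=list)  # device-specific units

@dataclass
class MultibeamScan:
    """Hypothetical wrapper: an array of single-beam messages."""
    beams: List[SonarBeam] = field(default_factory=list)

# A profiling multibeam fills `range`; an imaging one fills `intensity`.
profiling = MultibeamScan(beams=[
    SonarBeam(range_min=0.5, range_max=100.0, range=[12.3])
    for _ in range(256)])
imaging = MultibeamScan(beams=[
    SonarBeam(intensity=[0.1, 0.9, 0.3])
    for _ in range(256)])
```

Both device classes then share one beam type, which is the advantage claimed above: no separate profiling and imaging multibeam messages are needed.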
