
Specify requirements and criteria for publishing test summaries and reports #5

Open
csarven opened this issue May 18, 2022 · 11 comments
Labels: documentation (Improvements or additions to documentation)


csarven commented May 18, 2022

This issue is an ACTION from https://github.com/solid/test-suite-panel/blob/main/meetings/2022-05-13.md#proposal-1-specify-requirements-and-criteria-for-publishing-test-summaries-and-reports, due 2022-05-27. Its purpose is to document a mutual understanding (see #considerations) towards the Test Suite Panel Charter proposal.

Related issues:

Considerations:

  1. What information should test reports and summaries include?
  2. What format and data models should the reports use?
  3. How to distinguish between approved and unapproved tests/results in reports?
  4. Where should reports be published?
  5. How can reports be submitted for publication?
  6. What level of consent is required from implementers to publish reports?
  7. Who authorises the publication of reports?
  • ...

csarven commented May 18, 2022

Status of this comment: Draft

  1. Test reports and summaries to include:
    • Document metadata (e.g., license, date, report generated by, submitted by, approved by, notes from the maintainer).
    • Description of the project (implementation) that's tested (e.g., repository, version, maintainers).
    • Reference to test metadata (e.g., test authors and reviewers, test review status, version of the test, setup, provenance, coverage) and test suite. Some information can be incorporated in test assertion info (see next point).
    • Test assertion (with URI) providing: the URI of the test that was run (with a description referring to the requirement URI), the asserter (URI), the subject of the test (URI), the mode in which the test was performed (URI), and the test result (URIs and descriptions), i.e., the outcome plus additional information about the test, the result, and notes from the maintainer.
    • Approved test reports must not include assertions of tests with rejected review status.
    • The composition of the summary document is TBD.
  2. Human-visible (HTML table) and machine-readable (RDF(a)). Specific details on the vocabularies to be used will be documented later, but DOAP, EARL, Test Description, DC Terms, and PROV-O are good candidates.
  3. Each test's assertion info must indicate the review status of the test (at the time the test was run). The report must also indicate the publication status, along with other visual cues.
  4. Specifications link to approved implementation reports summary/index, e.g., https://solidproject.org/test-reports/{technical-report-short-name}/summary where it will link to individual implementation reports e.g., https://solidproject.org/test-reports/{technical-report-short-name}/{uuid}.
  5. GitHub repo/directory and/or report sent as an LDN (TBD: either including or requesting maintainer's approval.)
  6. Project maintainer will be notified to review the publication of a report, and if no objection within a certain time (TBD), it can be approved for publication by the submitter (or test suite panel). During the review process, maintainers will be given the opportunity to provide explanatory notes to go with the report.
  7. Publication of reports can be pre-authorized for approved tests.
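
As a rough, non-committal illustration of points 1–3: a single test assertion using the EARL vocabulary named above could be serialized to Turtle along these lines. The URIs (test, implementation under test, asserter) are hypothetical placeholders, and the helper below is a sketch, not a proposed harness design:

```python
# Sketch (assumption): minimal generator for one EARL assertion in Turtle.
# All example.org URIs are hypothetical placeholders, not real resources.

def earl_assertion(test_uri, subject_uri, asserter_uri, outcome, mode="automatic"):
    """Render a single EARL assertion as a Turtle snippet."""
    return f"""\
@prefix earl: <http://www.w3.org/ns/earl#> .

[] a earl:Assertion ;
   earl:assertedBy <{asserter_uri}> ;
   earl:subject <{subject_uri}> ;
   earl:test <{test_uri}> ;
   earl:mode earl:{mode} ;
   earl:result [ a earl:TestResult ; earl:outcome earl:{outcome} ] .
"""

snippet = earl_assertion(
    "https://example.org/tests/protocol-01",   # hypothetical test URI
    "https://example.org/my-solid-server",     # implementation under test
    "https://example.org/test-harness",        # asserter
    "passed",
)
print(snippet)
```

Per point 3, the test's review status at run time (and the maintainer's notes, per point 1) would hang off the same assertion; the exact properties for that are among the details to be documented later.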

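For point 5's LDN option, the submission could be a notification POSTed to the receiver's inbox as application/ld+json. A sketch of the payload, under the assumption of an ActivityStreams-style announce (all URLs hypothetical):

```python
import json

# Sketch (assumption): an LDN payload announcing a test report for review.
# The actor and object URLs are hypothetical placeholders; per LDN, this
# body would be POSTed to the inbox discovered via the receiver's ldp:inbox.
notification = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Announce",
    "actor": "https://example.org/test-harness",
    "object": "https://solidproject.org/test-reports/protocol/123e4567",
    "summary": "Test report submitted for maintainer review",
}
body = json.dumps(notification)
# POST body to the inbox with Content-Type: application/ld+json (not shown).
```

Whether the notification carries the maintainer's approval or requests it is the TBD noted in point 5.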
@edwardsph (Contributor) commented

This all looks good to me, with a couple of comments:
4. What is the link to? Approved tests or reports? It seems to be referring to reports but then mentions tests. Can you clarify, please?
6. The second sentence is incomplete.

Ref 6 & 7: Whilst we work towards a goal of full test coverage by approved tests, we will have a mix of approved and unreviewed tests. The results of all these tests are of value to maintainers since unreviewed tests may have valid results and will often relate to newer parts of the spec. This can help maintainers confirm that they (and the tests) are headed in the right direction.

So, just to clarify, you are proposing that any CG-published report should only contain the results of approved tests? I see the benefit of that: it means the reports can be fully trusted, and it avoids any misinterpretation. On the other hand, I think it is important that full test results are available to maintainers. In point 7 you suggest that reports with results from unapproved tests can be published anywhere; can we discuss the potential audience for this? If it is maintainers, then the results don't need publishing as such, they just need to be made available to them. If not maintainers, then who is going to benefit from publishing results that are less authoritative? Won't it be confusing to have reports in multiple locations?
When we catch up and get all existing tests approved, then all results would be CG-published so that makes any other publishing locations redundant. I think we need to be very clear about the purpose of publishing results elsewhere.

@michielbdejong commented

There are two sides to this:

  1. What we (the Solid test suite panel) publish on solidservers.org, and our commitments there, e.g. include all necessary remarks and caveats, make it machine readable, have a process in place for how we update it, etc.
  2. What we can do to eventually get our reports accepted for publishing on solidproject.org, e.g. get 100% of specification-tests approved by spec editors, get jest-based tests phased out, etc.

@michielbdejong commented

I think on solidservers.org we should drop all occurrences of the word 'independent', to avoid any implication that the specification-tests are anything less than independent.

@edwardsph and I also discussed a construction where we don't publish the output of the specification-tests directly, to avoid any implications that the specification-tests are "done". For the jest-based tests we can publish the full machine readable output though, and if there is data we discover through running the specification-tests in private, we could always write a jest-based test that replicates those findings.

So then, basically: the jest-based test report goes on solidservers.org, and the machine-readable output of the specification-tests will not be published for the time being, but will one day (when they, and the spec, are finished) be published on solidproject.org.

@michielbdejong commented

Project maintainer must approve the publication of reports of approved tests.

For publication at https://solidproject.org/test-reports/{technical-report-short-name}/ we (or the webmasters of solidproject.org) can use that rule, but for https://solidservers.org we should continue to follow https://github.com/solid/process#stakeholders in deciding what to publish. We could for instance have a process where server implementers get 2 weeks to comment on new test failures for their servers before we publish them.

@csarven can you please try to propose a different text for point 6? It also reads like the second sentence is incomplete.

@michielbdejong commented

@csarven thanks for editing it, but I think there is still major room for improvement. Let's have a video call to discuss the details; see Gitter DM.

@michielbdejong commented

Thanks for your time @csarven - as discussed, from my PoV, what we want above all is "one panel, one charter", and I think we can achieve that. We just need to write down the way we work, in a charter that covers both our current reporting while Solid is still in development, and our goals for what the test suite should look like in the future.

  1. Project maintainer will be notified ahead of publication of reports that include tests with unreviewed or on-hold status. In case of a conflict of interest, we adhere to https://github.com/solid/process#stakeholders and https://en.wikipedia.org/wiki/Coordinated_vulnerability_disclosure .

@michielbdejong commented

Yes! Thanks for updating point 6 @csarven, LGTM.

@michielbdejong commented

Regarding "if no objection within a certain time (TBD), it can be approved for publication by the submitter (or test suite panel)" ... "maintainers will be given the opportunity to provide explanatory notes to go with the report" in point 6: that implicitly suggests that, in case of objections, adding these explanatory notes can be a way to resolve them. Maybe we can additionally mention the priority of constituencies, just to make it clear that all of this is "within reason", and that maintainers cannot simply object without providing any reason or explanatory notes for the cause of their objection.

michielbdejong commented Jan 17, 2023

And:

  1. Human-visible (HTML table) and (in the future) machine-readable (RDF(a)).

because the report on https://solidservers.org is currently not machine-readable.
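
As a rough sketch of that split (human-visible now, machine-readable later), a single result record could feed both outputs; the field names here are illustrative assumptions, not a proposed schema:

```python
# Sketch (assumption): one test result record rendered as a human-visible
# HTML table row now; the same record could later also be serialized in a
# machine-readable form (e.g., EARL). Field names are illustrative only.
result = {"test": "protocol-01", "subject": "my-solid-server", "outcome": "passed"}

html_row = "<tr>{}</tr>".format(
    "".join(f"<td>{result[k]}</td>" for k in ("test", "subject", "outcome"))
)
```

Keeping the underlying record structured would make adding the machine-readable serialization later a rendering change rather than a data change.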

michielbdejong commented Jan 17, 2023

Proposal for an additional point: if there is a spec dispute, then we accept all interpretations of the spec as "pass".

@csarven csarven self-assigned this Jan 17, 2023
@csarven csarven added the documentation Improvements or additions to documentation label Jan 17, 2023
@csarven csarven moved this from Todo to In Progress in <https://csarven.ca/#i> foaf:interest Jan 17, 2023