This document helps you contribute stories, i.e., test reports or leaderboards, to Ianvs. If you follow this guide and find a problem, please submit an issue so that this file can be updated.
Everyone is welcome to submit and share their own test reports with the community.
Ianvs is managed with git, and to develop locally you will need to install git.
You can check whether git is already on your system and properly installed with the following command:

```shell
git --version
```
Clone the Ianvs repo:

```shell
git clone http://github.com/kubeedge/ianvs.git
```
Please follow the Ianvs setup to install Ianvs, and then run your own algorithm to produce test reports.
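The exact command depends on your scenario, but a minimal sketch of a benchmarking run looks like the following; the configuration path is a placeholder, so substitute the benchmarking job file for your own algorithm:

```shell
# A minimal sketch, assuming Ianvs is installed per the setup guide.
# "examples/<your-scenario>/benchmarkingjob.yaml" is a placeholder for
# the benchmarking job configuration of your own algorithm.
cd ianvs
ianvs -f ./examples/<your-scenario>/benchmarkingjob.yaml
```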
You may want to compare your test results with those on the leaderboard.
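As a rough illustration, the ranking produced by a run is written under the workspace configured in your benchmarking job; the path and file name below are assumptions that depend on your workspace and rank settings:

```shell
# Illustrative only: the actual location is determined by the
# "workspace" field and the rank save mode in your benchmarking job config.
cat <your-workspace>/<your-job-name>/rank/all_rank.csv
```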
Test reports are welcome after benchmarking; they can be submitted here for further review.
Leaderboards, i.e., rankings of test objects, are public for everyone to visit. Example: leaderboard.
Apart from the Ianvs owners, there are two main roles in a leaderboard publication:
- Developer: submits the test object for benchmarking, including but not limited to materials such as the algorithm, test cases following Ianvs settings, and interfaces.
- Maintainer: tests the materials provided by developers and releases the updated leaderboard to the public.
For potential developers:
- Develop your algorithm with Ianvs and choose the algorithm to submit.
- Make sure the submitted test object runs properly under the latest version of Ianvs before submission. Maintainers are not responsible for debugging submitted objects.
- There is no need to submit a new leaderboard. Maintainers are responsible for keeping the test environment consistent for all test objects under the same leaderboard and for executing the test objects to generate a new leaderboard.
- If the test object is ready, you are welcome to contact the Ianvs owners. The owners will connect you with maintainers in order to receive your test object. Note that by submitting a test object, developers grant maintainers the right to test it.
For potential maintainers:
- To keep test environments and test objects consistent, leaderboard maintenance is at present open to acknowledged organizations that apply to take charge. Please contact the Ianvs owners.
- Maintainers are responsible for the results they submit.
- Maintainers should update the leaderboard monthly.
- Maintainers are NOT allowed to use test objects for purposes other than Ianvs benchmarking without formal authorization from the developers.
- Besides submitted objects, maintainers are encouraged to test objects released by KubeEdge SIG AI as well as other classic solutions released to the public.