Develop and document a location proof recipe for a sensor data strategy. This approach relies on devices or systems collecting sensor data (e.g., images, GPS coordinates, accelerometer readings, sounds) to infer and verify location. Techniques might include geospatial AI extraction from images, analyzing variation in GPS coordinates, or other sensor-based evidence. The goal is to establish a clear and reusable pattern, with code and documentation, that can be attached to location proofs created based on the generalized Location Proof Protocol.
Purpose
Enable location proofs based on sensor data analysis (e.g., GPS, images, accelerometer data).
Provide developers with a pattern and implementation for integrating this strategy into applications.
Extend the Location Proof Protocol’s flexibility while maintaining adherence to the standardized data model.
Requirements
0. Research Prior Art
Research and document existing techniques for inferring location from sensor data.
Examples: GPS signal variation analysis, object recognition from images, accelerometer-based location inference.
1. Define the sensor data strategy
Determine how sensor data will be collected and processed to generate and verify location proofs:
Examples:
Geospatial AI extraction from images (e.g., recognizing landmarks, detecting environmental features).
Variation in GPS readings analyzed for plausibility.
Sensor fusion (combining GPS, accelerometer, magnetometer, etc., for enhanced accuracy).
Consider privacy, scalability, and the reliability of the chosen sensor types.
Define the inputs and outputs for this recipe:
Input: Sensor data (e.g., GPS logs, image files, accelerometer readings).
Output: Verifiable data encoded in the proof schema as an element of the `recipePayload` bytes array (one possible encoding is sketched below).
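To make this input/output contract concrete, here is a minimal sketch of one possible creation path: a GPS-variation strategy that computes dispersion statistics over a window of fixes, applies a plausibility check, and ABI-encodes the evidence for inclusion in the `recipePayload` bytes array. The `GpsFix` shape, the variance thresholds, the fixed-point scaling, the tuple layout, and the ethers v6 dependency are all assumptions for illustration, not part of the protocol.

```typescript
import { AbiCoder } from "ethers";

// Hypothetical shape of a raw GPS fix collected on-device (illustrative only).
interface GpsFix {
  lat: number;       // degrees
  lon: number;       // degrees
  accuracy: number;  // reported accuracy in metres
  timestamp: number; // unix seconds
}

// Compute simple dispersion statistics over a window of fixes.
// Real devices show small, noisy variation; a perfectly static or wildly
// jumping trace is treated as implausible. The bounds below are assumed.
function analyzeGpsVariation(fixes: GpsFix[]) {
  if (fixes.length === 0) throw new Error("no GPS fixes provided");
  const meanLat = fixes.reduce((s, f) => s + f.lat, 0) / fixes.length;
  const meanLon = fixes.reduce((s, f) => s + f.lon, 0) / fixes.length;
  const variance =
    fixes.reduce(
      (s, f) => s + (f.lat - meanLat) ** 2 + (f.lon - meanLon) ** 2,
      0
    ) / fixes.length;
  const plausible = variance > 1e-12 && variance < 1e-6; // assumed bounds
  return { meanLat, meanLon, variance, plausible };
}

// ABI-encode the evidence so it can sit in the recipePayload bytes array.
// The tuple layout is an assumption for this sketch, not a defined schema.
export function buildRecipePayload(fixes: GpsFix[]): string {
  const { meanLat, meanLon, variance, plausible } = analyzeGpsVariation(fixes);
  const coder = AbiCoder.defaultAbiCoder();
  return coder.encode(
    ["int64", "int64", "uint256", "bool"],
    [
      Math.round(meanLat * 1e7),              // fixed-point degrees * 1e7
      Math.round(meanLon * 1e7),
      BigInt(Math.round(variance * 1e18)),    // scaled variance
      plausible,
    ]
  );
}
```

ABI encoding is used here only because it keeps the payload compact and trivially decodable from both TypeScript and Solidity; any deterministic encoding would do.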
2. Implement the Recipe
Implement the code required to process sensor data and generate location proofs with this recipe data attached.
Implement code to verify location proofs with recipe data attached in a client or server environment (an off-chain verification sketch follows this list).
(Ideally) implement a smart contract function to verify location proofs with this recipe data attached (onchain verification).
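For the off-chain verification path, a matching sketch might decode the assumed payload layout and re-apply the same bounds used at creation time. As before, the layout and thresholds mirror the illustrative encoder above and are not a defined schema.

```typescript
import { AbiCoder } from "ethers";

// Decode the assumed payload layout and re-check the claim.
// Returns false rather than throwing so callers can aggregate results.
export function verifyRecipePayload(payload: string): boolean {
  try {
    const coder = AbiCoder.defaultAbiCoder();
    const [lat, lon, scaledVariance, plausible] = coder.decode(
      ["int64", "int64", "uint256", "bool"],
      payload
    );
    // Coordinates are fixed-point degrees * 1e7 (assumption from the sketch above).
    const latOk = lat >= -900_000_000n && lat <= 900_000_000n;
    const lonOk = lon >= -1_800_000_000n && lon <= 1_800_000_000n;
    // Re-apply the same (illustrative) variance bounds, scaled by 1e18.
    const varianceOk =
      scaledVariance > 1_000_000n && scaledVariance < 1_000_000_000_000n;
    return Boolean(plausible) && latOk && lonOk && varianceOk;
  } catch {
    return false; // malformed payloads fail verification
  }
}
```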
3. Documentation
Provide a detailed walkthrough in the repository:
Explain the strategy and how it works (inputs, outputs, logic).
Include setup instructions for testing or integrating this recipe.
Provide example proof data for developers to reference.
This should be submitted as a pull request to this repo (astralprotocol), in a directory here. See this for an example (untested!) recipe. (Exactly how proof recipes are documented is far from settled, and we want help refining it.)
4. Testing
Test performance under typical and high-load conditions.
Provide a simple front-end demo or CLI example for local testing.
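A round-trip unit test for the illustrative encoder/verifier above might look like the following, using Node's built-in test runner purely for convenience (the repository's actual test framework may differ). The fixture values and module paths are made up.

```typescript
import test from "node:test";
import assert from "node:assert/strict";
import { buildRecipePayload } from "./buildRecipePayload";   // placeholder path
import { verifyRecipePayload } from "./verifyRecipePayload"; // placeholder path

// Made-up fixture: a short trace of noisy GPS fixes around a single point.
const fixes = Array.from({ length: 10 }, (_, i) => ({
  lat: 51.5007 + i * 1e-6,
  lon: -0.1246 - i * 1e-6,
  accuracy: 5,
  timestamp: 1700000000 + i,
}));

test("payload created from plausible GPS variation verifies", () => {
  const payload = buildRecipePayload(fixes);
  assert.equal(verifyRecipePayload(payload), true);
});

test("malformed payload is rejected", () => {
  assert.equal(verifyRecipePayload("0xdeadbeef"), false);
});
```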
5. Encourage Developer Creativity
While the base implementation can use GPS, images, or accelerometer data, you are welcome to propose alternative approaches; we haven't figured this out yet, and it's innovation work!
Acceptance Criteria
A description of the recipe, the data it collects, and how the proof is created, plus code for creating and verifying proofs using a sensor data strategy, submitted to this repository as a pull request.
Comprehensive tests and documentation for creating and verifying location proofs using this strategy.
Tasks
Design and define the strategy, including key interactions and schema mappings.
Implement the recipe creation + validation scripts (preferably in TypeScript or Python, plus optionally a verification function in Solidity).
Write unit tests to validate the recipe’s functionality.
Add documentation with clear examples for using the recipe.
Create a simple demo (e.g., CLI or web-based) to showcase the strategy (a minimal CLI sketch follows this list).
Submit for peer review and incorporate feedback.
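For the demo task, something as small as the following CLI sketch would be enough for local testing; the file format, default path, and module paths are placeholders.

```typescript
// demo.ts (hypothetical): read a JSON file of GPS fixes, build and verify a payload.
import { readFileSync } from "node:fs";
import { buildRecipePayload } from "./buildRecipePayload";   // placeholder path
import { verifyRecipePayload } from "./verifyRecipePayload"; // placeholder path

const path = process.argv[2] ?? "fixes.json"; // placeholder default input
const fixes = JSON.parse(readFileSync(path, "utf8"));

const payload = buildRecipePayload(fixes);
console.log("recipePayload element:", payload);
console.log("verifies:", verifyRecipePayload(payload));
```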
If you have questions or ideas for alternative approaches, feel free to comment or raise a related issue in the repository.