Stereo Vision Experiment Review Tool

With this tool, users can gather new insights into the ongoing experiments on the space station.

The solution created by the team engages students in innovative, complementary learning opportunities and increases their interest in science, technology, engineering, and mathematics.

The Challenge: Spacebot Stereo Vision

SpaceBots on the International Space Station see in stereo vision. We have data to prove it – based on the NASA and MIT Zero Robotics programming competition, which starts online with teams programming a space bot (called SPHERES) to solve problems onboard the International Space Station. An astronaut conducts the competition in zero gravity with a live broadcast for viewers back home on planet Earth. We want you to apply “stereo vision” data from space to help us see the world in new ways. Use the data output from this challenge to create a new point of view for how we see the world. You can create a data dashboard, visualization, infographic, app, software tool, or your own Earth-based robot.

The Stereo Vision Experiment Review Tool team used the stereo image data provided by the SPHERES microsatellite project to build an experiment review iPad app. NASA's SPHERES program (Synchronized Position Hold, Engage, Reorient, Experimental Satellites) consists of three free-flying vehicles inside the International Space Station, identifiable by their red, blue, and orange shells. Initially, the SPHERES were designed for testing control theory algorithms. The satellites are about the size and mass of a bowling ball and use cold-gas (CO2) thrusters to propel themselves around a fixed experimental volume.

Data for this solution comes from NASA's SPHERES program, which uses ultrasound beacons as a metrology system to determine each satellite's position, in conjunction with accelerometers and gyroscopes. The tool created by the team allows scientists to easily review YouTube videos captured from the space station experiments by facilitating browsing and visualization. Users can bookmark and annotate significant areas of the timeline and browse adjacent time-slices interactively. The team created an API that links the annotated video feed and specific frames to visualizations.
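The bookmarking and adjacent-time-slice browsing described above can be illustrated with a minimal data-model sketch. This is not the team's actual implementation; the class and method names (`ExperimentVideo`, `bookmark`, `adjacent`, `frame_url`) are illustrative assumptions, and only the YouTube `t=` deep-link parameter is a real, documented convention.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A bookmarked moment in an experiment video (hypothetical model)."""
    time_s: float  # offset into the video, in seconds
    note: str      # reviewer's comment

@dataclass
class ExperimentVideo:
    """Links a YouTube recording of an ISS experiment to its annotations."""
    video_id: str
    annotations: list = field(default_factory=list)

    def bookmark(self, time_s: float, note: str) -> Annotation:
        """Add an annotation and keep the timeline sorted by time."""
        ann = Annotation(time_s, note)
        self.annotations.append(ann)
        self.annotations.sort(key=lambda a: a.time_s)
        return ann

    def adjacent(self, time_s: float, window_s: float = 5.0) -> list:
        """Return annotations within window_s seconds of a point on the
        timeline, supporting interactive browsing of nearby time-slices."""
        return [a for a in self.annotations
                if abs(a.time_s - time_s) <= window_s]

    def frame_url(self, time_s: float) -> str:
        """Deep-link an annotation to the matching moment on YouTube."""
        return f"https://www.youtube.com/watch?v={self.video_id}&t={int(time_s)}s"
```

A linking API like `frame_url` is what lets a visualization panel jump straight to the video frame a data point came from, as the write-up describes.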

Made in Glasgow, Scotland, by Tom Halfpenny at Space Apps 2015.
