Vail students use sensors and a robot to “Stop at Blue”

June 10, 2024

The educational team at the Trailblazers Vail Blended Learning center invited me, Shravan Aras, Assistant Director at the University of Arizona Center for Biomedical Informatics & Biostatistics, to speak to their students as part of their career speaker series on May 14, 2024, at the Vail Innovation Center in the Vail Unified School District. Trailblazers focuses on a blended model of learning and teaching that combines the best aspects of computer-based learning with the power of small-group instruction to give students a challenging curriculum. Their career speaker series invites individuals from all walks of life to give students insight into various professions.

As I primarily work on wearable sensors, data analytics, and systems programming, I wanted to create small demos for the 11–14 year-old students that would be interesting to watch and would give them a glimpse into the wonderful world of embedded systems and sensors, along with an introduction to data science and AI-based algorithms. First, I showed the students a recent AI-based research project that uses deep convolutional networks to identify and count sweat pores in thermal camera images. The idea was to get them excited about the field rather than focus on the complex details, by showing them images and results that were easy to understand, and to spark their interest in the ever-expanding world of data science.

[Image: Vail students looking on]

I then demoed an Arduino board reading data from various environmental sensors, along with a fun game I called “Stop at Blue.” These devices, along with a Sphero RVR robot, were graciously lent to me by the University of Arizona Sensorlab. The demo made use of interfaces such as SPI (Serial Peripheral Interface) and I2C (Inter-Integrated Circuit, a two-wire interface) to pull data from a temperature sensor, a 6-axis inertial sensor, an ambient sound sensor, and an optical intensity sensor, and to write content to a tiny OLED screen. Students could place their fingers on the optical sensor and watch the LEDs dim with the falling light intensity, or speak close to the ambient sound sensor (or tap it for added effect) and watch a bank of LEDs light up sequentially based on the amplitude of the sound signal.

The Arduino finale was the game I had programmed, “Stop at Blue.” The simple game consists of four LEDs in varying colors, one of them blue. The code cycles through the LEDs, lighting each one for a short interval, with the cycling speed controlled by a potentiometer. The goal was for the students to press a button to stop the sequence exactly when the blue LED was lit. If they got it right, they could increase the speed, and in turn the difficulty, by rotating the potentiometer. This was tremendously fun and educational, with everyone wanting to test their reflexes against the game.
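For anyone curious how the game is put together, below is a minimal Arduino-style sketch of the logic. The pin numbers, LED ordering, and timing constants are illustrative assumptions rather than my exact classroom wiring.

    // "Stop at Blue": minimal sketch of the game logic.
    // Pin numbers and timing values are illustrative placeholders.
    const int ledPins[4] = {2, 3, 4, 5};   // four LEDs of different colors
    const int BLUE_INDEX = 2;              // position of the blue LED in ledPins
    const int BUTTON_PIN = 7;              // push button to ground, using the internal pull-up
    const int POT_PIN = A0;                // potentiometer controlling the cycling speed

    int current = 0;                       // index of the LED currently lit

    void setup() {
      for (int i = 0; i < 4; i++) pinMode(ledPins[i], OUTPUT);
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      Serial.begin(9600);
    }

    void loop() {
      // Light only the current LED.
      for (int i = 0; i < 4; i++) digitalWrite(ledPins[i], i == current ? HIGH : LOW);

      // Map the potentiometer reading (0-1023) to a dwell time: lower = faster = harder.
      int dwell = map(analogRead(POT_PIN), 0, 1023, 60, 600);

      // Watch the button while this LED is lit.
      unsigned long start = millis();
      while (millis() - start < (unsigned long)dwell) {
        if (digitalRead(BUTTON_PIN) == LOW) {   // button pressed
          Serial.println(current == BLUE_INDEX ? "Stopped at blue!" : "Missed, try again.");
          delay(1000);                          // short pause before the cycle resumes
          break;
        }
      }

      current = (current + 1) % 4;              // advance to the next LED
    }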

[Image: Sphero robot during the color paper demo]

The final demo made use of the Sphero RVR robot, which can be programmed in both JavaScript and a Scratch-like programming language. It houses an RGB color sensor on its underside and two stepper motors capable of precise movement and tracking, along with inertial sensors. The students instantly connected with the idea of programming it via Scratch, as they had learned Scratch in school (though the demo code I wrote was in JavaScript). The highlight of the Sphero demo was making it crawl (slowly turning its motors) over a set of colored papers laid out one after the other. It was programmed to read the color it was crawling over, and if that color was black, to make a 180-degree turn and head back to the starting point. As all good demos go, it failed on the first attempt: the Sphero completely missed its cue to turn on the black paper and kept going. I found this to be a rather fun opportunity to show the students how to debug when things go wrong. It turned out that the demo room was much darker than the room in which I had initially calibrated the color sensor. After some quick recalibration code, the Sphero was back on its way, turning correctly on the black paper and returning. The last part of the demo made use of the light sensor on top of the Sphero: it was programmed to “run away” whenever the students hovered their hands over the sensor and reduced the light intensity.
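The actual color-paper code ran as JavaScript in the Sphero Edu environment; to keep the examples in this post in one language, the sketch below expresses the same turn-around logic in plain C++. The helper functions (readColorRGB, crawlForward, turnDegrees) and the threshold value are hypothetical stand-ins for the robot API, stubbed out only to show the flow and where the recalibration mattered.

    // Turn-around logic of the color-paper demo, sketched in plain C++.
    // readColorRGB, crawlForward and turnDegrees are hypothetical stand-ins
    // for the Sphero API, stubbed out so the sketch compiles on its own.
    #include <iostream>

    struct RGB { int r, g, b; };

    RGB readColorRGB() { return RGB{20, 22, 19}; }   // stand-in: sample the downward-facing color sensor
    void crawlForward(int ms) { (void)ms; }          // stand-in: drive the motors slowly for a short step
    void turnDegrees(int deg) { (void)deg; }         // stand-in: rotate in place

    // Per-channel threshold below which a reading counts as "black".
    // The darker demo room shifted the raw readings, which is what the
    // quick recalibration of this kind of threshold fixed.
    const int BLACK_THRESHOLD = 60;

    bool isBlack(const RGB& c) {
      return c.r < BLACK_THRESHOLD && c.g < BLACK_THRESHOLD && c.b < BLACK_THRESHOLD;
    }

    int main() {
      bool headingHome = false;
      for (int step = 0; step < 200; ++step) {   // bounded loop instead of driving forever
        crawlForward(200);                       // inch forward in small steps
        RGB sample = readColorRGB();
        if (isBlack(sample) && !headingHome) {
          turnDegrees(180);                      // black paper reached: turn around
          headingHome = true;                    // and crawl back toward the start
          std::cout << "Black detected, heading home\n";
        }
      }
      return 0;
    }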

While basic, the demos and the presentation gave the students an exciting glimpse into the world of data science, environmental sensors, and embedded systems programming. The students left with large smiles on their faces as the bell rang, with some staying behind to ask questions. Their genuine urge to learn, excitement, and energy made this a memorable day. I would like to thank all the teachers at the Vail Innovation Center, with a special shoutout to Mrs. Lotti (English/Social Studies) and Mrs. Husfelt (Special Education), for giving me the opportunity to interact with those brilliant young minds.

###