Category Archives: Education Opportunity

Podcasts I'm Listening To

Image: http://www.urbandharma.org/images/podcasts011.jpg

I've had a few requests to list the podcasts that I'm listening to. In addition, Elecia and company over at Embedded.fm compiled a list recently as well, so I'm hopping on the bandwagon. If you haven't tried listening to the Don't Panic Geocast that I co-host, I would highly recommend it, but I'm biased! These shows are listed alphabetically and span a variety of topics. I will come back and periodically update this list as I discover new shows and mark discontinued shows as such. Please feel free to recommend any you think I have missed!

Build a Drone!

[Image: Servo Magazine May 2016 cover]

Things have been a bit slow here at the blog with a lot happening at work and the fact that I'm now also very excited to be writing a monthly column about multi-rotors in Servo Magazine! Not to worry though, I've still got some excellent projects queued up for the blog, including one that has currently filled my living room with sawdust.

In this post, I wanted to share a video of the drone I scratch-built flying around and encourage you to follow along and build it as well! The entire project cost about $350 and produced a really nice and versatile platform that I'm going to be adding instruments to, as well as GPS, telemetry, etc. The May issue of Servo featured the monthly column introduction on the cover! That column talks about FAA rules and how to get registered. The following columns will go through building the drone, step by step. We'll start off with the airframe, then move on to adding electronics, setting up the flight controller, and finally flying under manual and computer control. We already have other articles planned, including reviews of commercially available quads and hardware hacking them for new functionality. If you like the blog, you might like to follow that series of articles as well!

How Thick is the Crust?

Earth's Structure (Wikipedia)

I think the first time I really heard much about the Earth's crust was on the TV show "Bill Nye the Science Guy." (In fact, I was obsessive about not missing an episode as a child and I was ecstatic when I got to see Bill speak at Penn State last year.) He talked about earthquakes and Earth's structure, cut in with funny segments of a family telling their son, "Ritchie, eat your crust."

The crust is an interesting thing - it's what we live on top of, and there are lots of interesting places where it's different due to geologic processes that concentrate certain types of materials. The crust is broken up into around a dozen major tectonic plates that move at roughly 1-4 inches (2-10 cm) per year. These plates are made of either oceanic or continental crust. Oceanic crust is relatively thin at ~6 km (4 miles), while continental crust is much thicker at ~35 km (22 miles). The thin oceanic crust is also more mafic and dense than the felsic continental crust.

These differences create complex interactions when the plates meet each other at plate boundaries. We did a whole show on plate tectonics over at the Don't Panic Geocast recently, so if you'd like to hear about the discovery and arguments over plate tectonics you should check it out.

Today, I'd like to share a tool that Dr. Charles Ammon and I have made to visualize a crust model and allow anyone to explore the crust. All you need is Google Earth! We used a model called Crust 1.0 by Laske et al. that gives the thickness of the crust (broken up into a few divisions) at 64,800 points on the Earth, along with some other crustal properties. That's every one degree of latitude and longitude! They put a lot of work into making this model. Generally we would use a Fortran program to get values out of the model, but Dr. Ammon had an idea to visualize the data in a more intuitive way with Google Earth. Over the Thanksgiving holiday I wrote a Python utility to access the model values, and then we wrote a simple script that generates a Google Earth KML file based on the model.
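If you're curious what the KML-generation step looks like, here's a minimal sketch (not the actual project code - that lives in the GitHub repository) using the simplekml Python package, with a couple of made-up model values:

```python
# Minimal sketch: turn (lat, lon, crustal thickness) values into a KML file.
# The values below are invented for illustration; the real script reads Crust 1.0.
import simplekml

points = [(0.5, -30.5, 12.8),    # an oceanic point
          (40.5, -78.5, 38.2)]   # a continental point

kml = simplekml.Kml()
for lat, lon, thickness in points:
    pnt = kml.newpoint(coords=[(lon, lat)])  # KML expects (lon, lat) order
    pnt.description = "Crustal thickness: %.1f km" % thickness
    pnt.style.iconstyle.color = simplekml.Color.red
kml.save("crust_points.kml")  # open this file in Google Earth
```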

All you have to do is head over to the project's GitHub page and click the "Download ZIP" button. While you're waiting on the download, you can scroll down and read all about the development, the model, and find activities to try. Next, open the folder you downloaded (most operating systems will automatically unzip it for you); there will be several files, but the only one you need is the CRUST_1.0.kmz file.


As long as you have Google Earth installed, double-click that file and you'll see the Earth appear covered in red dots. If you zoom out too far, they will disappear though!


Each red dot is a location where the model provides average crustal properties, like how fast P and S seismic waves can travel and the density. All of these are explained in more detail on the project webpage. You should also try some of the projects we have listed there! As a starter, let's look at oceanic and continental crust and verify my assertion about their thickness difference.

Clicking out in the Atlantic Ocean (make sure you are not on the continental shelf), we see crust about 13 km thick (the "top of mantle" number). The water depth is also handy to have on hand sometimes.


Clicking well onto the North American plate, we see crust about 36 km thick. Next you should head over to mountainous regions and basins and see how the structure of the crust is different - why is that? Sorry, no homework answers here!


This is a really fun way to learn about the crust and a good reference tool as well! There are flyers in the docs folder that you can print to use as teaching aids or hand out to students! We had a lot of fun making this one-day project and hope that you'll explore it and let us know what you think! A big thanks to the folks that did the massive amount of work making the model - we just made it visible in Google Earth! Everything is open-source as always.

Tracking Earthquakes Across the Globe - Travel Times

A few days ago I had the Epicentral+ app running on my iPad on my desk and saw an event come across the screen. By looking at which stations measured ground movement first, second, third, etc., I could make a good guess at the event's location. Did you know that, with the data from a single seismic station, you can begin to guess the epicenter?

Generally, earthquake locations are determined using many stations and algorithms that have been tweaked for years in the pursuit of ever more accurate locations. The USGS performs these locations for many events every day. It's fun to keep a live feed of the global seismic data up and look at the patterns. This is possible thanks to applications like "Earth Motion Monitor" and "Epicentral+", both products of Prof. Charles Ammon. They are worth installing and having a look at. Prof. Ammon has seen the value in being able to watch signals for long periods of time: you begin to pick out patterns and get an intuitive feel for the response seen due to different events. While I don't have nearly the amount of insight possessed by experienced seismologists, I wanted to show you a quick and simple way to figure out about how far an event was from a given station. If you combine that with some geologic knowledge of where plate boundaries are, you can likely narrow down the region and earthquake type before anything comes out online.

The event I saw was pretty small, a magnitude 5.8 near the Fiji Islands; it'll work for our purposes and not provide too much distraction. I've marked it with a white star on the map below (a Google Earth map with the USGS plate boundary file). This event occurred near the North New Hebrides trench, part of a slightly complex zone where the Australian plate is being pushed under, or subducted beneath, the Pacific plate.

Our earthquake in question marked with a white star near the North New Hebrides trench.

Though the event was not huge, it was detected by many seismometers around the globe. In fact, there is a handy map of the stations with adequate signal automatically generated by IRIS. The contour lines on the map show distance from the event in degrees (more on that later).

Stations and their distance from the earthquake. (Image: IRIS)

I saved an image of assorted global seismic stations about an hour after the event occurred. You can see energy from the earthquake recorded on all stations, with some really nice large packets of surface waves (the largest waves on the plot).

[Image: raw seismograms from stations around the globe]

We're actually interested in the first two signals though: the classic P and S waves. Let's take a closer look at the station in Pohakuloa, Hawaii. We can see the first arrival, the P-wave, then a few minutes later the S-wave. The P-wave (a compressional wave, basically a sound wave) travels faster than the transverse S-wave, so they arrive at different times. We know the wave speeds with depth in the Earth, so by using the difference in time between these arrivals, we can come up with a rough distance to the event.

[Image: seismogram with the P and S arrivals annotated]

A graph of distance vs. arrival times can tell us the whole story. I've made a simple version in which you can find the time we measured (7.25 minutes) on the x-axis, then look up the distance on the y-axis. If we do this (marked in dashed black lines), we see that the distance should be about 50 degrees.

[Image: S-P travel time curve]
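If you'd like to compute a curve like this yourself, here's a minimal sketch using ObsPy's taup module with a standard 1-D Earth model (the 10 km source depth is an assumption):

```python
# Sketch: an S-P differential travel time curve from a standard Earth model.
import numpy as np
from obspy.taup import TauPyModel

model = TauPyModel(model="iasp91")

for deg in np.arange(10, 100, 10):
    arrivals = model.get_travel_times(source_depth_in_km=10.0,
                                      distance_in_degree=float(deg),
                                      phase_list=["P", "S"])
    p = min(a.time for a in arrivals if a.name == "P")  # first P arrival
    s = min(a.time for a in arrivals if a.name == "S")  # first S arrival
    print("%5.1f deg: S-P = %5.2f minutes" % (deg, (s - p) / 60.0))
```

Reading off the row where S-P is about 7.25 minutes should land you near 50 degrees, just like the dashed lines on the plot.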

That's not bad! I calculated the actual distance knowing the earthquake location and station location to be 49.6 degrees. The theoretical difference in travel time based on a simple Earth model is 7.14 minutes. The slight error is due to a complex real Earth, but mostly due to me picking a rough time on an iPad screen without really zooming in on the plot. The goal was to know about how far away the earthquake was from the station though, and we did that with no problem. Just from that information it was easy to tell that the event was in the Fiji region.

Distance is in degrees, which may seem a little strange. Since the Earth is a ball-like blob, defining distances across the surface is a little tricky when distances get large. It turns out to be more convenient to think of this distance as an angle made with the center of the Earth. Take a look at the screenshot below. It's from a program called taup and shows the actual paths taken by the P and S waves through a cut-away of the Earth. I've marked the angle I'm talking about with the Greek letter ∆. (We would formally say that this is the great circle arc distance in degrees. If you want to learn more about great circle arcs, you should check out our two-part podcast on map projections.)

[Image: taup ray paths through the Earth, with the angle ∆ annotated]
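For the curious, computing that great circle arc distance is just spherical trigonometry. Here's a quick sketch (the coordinates are rough, for illustration only):

```python
# Central angle (in degrees) between two points on a sphere.
from math import radians, degrees, acos, sin, cos

def arc_distance_deg(lat1, lon1, lat2, lon2):
    """Great circle arc distance in degrees between two (lat, lon) points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dlon = radians(lon2 - lon1)
    return degrees(acos(sin(phi1) * sin(phi2) +
                        cos(phi1) * cos(phi2) * cos(dlon)))

# Roughly the Fiji-region event and the Pohakuloa, Hawaii station:
print(arc_distance_deg(-13.0, 167.0, 19.7, -155.5))  # about 49 degrees
```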

As scientists, we often look at a travel time plot a little differently. There are many different waves or "phases" that we are interested in, so plotting a single line of S-P arrival difference is rather limiting. Instead we plot a classic "travel time curve", where the arrival time after the event is plotted as a function of distance. I've reproduced one below (table of data plotted from C. Ammon).

[Image: travel time curve]

We can make a plot like this from data too! Taking many stations and plotting their records as a function of distance, we get a plot like the one below. You can see curved and straight lines if you stand back and squint a little. Those are arrivals of different phases across the globe! Notice the lower curved line that matches the P-wave travel time above.

Notice the lines and curves made as different phases from the earthquake arrive across the globe. (Image: IRIS)

Like I mentioned, there are many different phases we can look at. To give you an idea of what a seismologist would look for, below is a version of the plot with a lot of the more complex phases marked on it. I know it looks intimidating, but for this event you'll see we really can't easily discern a lot of the phases. That's because this really isn't a huge event - which is nice for us, because it means the plot is easier to look at.

Arrival plot with phases marked. (Image: IRIS)

So there you have it: by remembering the rough travel time curves or posting one on your wall, you can quickly determine the approximate region where an earthquake occurred just by glancing at the seismograms!

Magnitude 7.1 Alaska Earthquake Visualizations

This morning there was a magnitude 7.1 earthquake beneath Alaska. Alaska is no stranger to earthquakes, and I'm not going to talk about the tectonics, but I wanted to share the ground motion videos I produced for the event. Also be sure to check out the ground motion videos over at IRIS. At present, no major damage or injuries have been reported, though CNN did sensationalize the earthquake (as they always do):


First, a video from a nearby station in Homer, AK. About 8 mm maximum ground displacement with some pretty large ground accelerations.

The earthquake recorded in Australia. Not as exciting, but notice the packets of waves towards the end of the video; these are the surface waves that took the longer route around the globe compared to their earlier counterparts (called R1/R2 and G1/G2).

Here's a central US station near where I grew up. Nice surface waves and a good example of what looks like the PcP phase (a P-wave reflected off the planet's outer core). The PcP phase arrives at about 604 seconds, around 100 seconds after the P wave. In the figure below the movie, the approximate PcP path is red and the P path is black. Pretty neat!

[Image: approximate PcP (red) and P (black) ray paths]
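If you want to check a phase identification like this yourself, ObsPy's taup module will predict the arrival times. Here's a sketch; the distance and depth are rough guesses for this station/event pair, not measured values:

```python
# Sketch: predicted P and PcP arrival times for a deep Alaskan event.
from obspy.taup import TauPyModel

model = TauPyModel(model="iasp91")
arrivals = model.get_travel_times(source_depth_in_km=120.0,  # assumed depth
                                  distance_in_degree=45.0,   # assumed distance
                                  phase_list=["P", "PcP"])
for a in arrivals:
    print("%-4s arrives at %6.1f s" % (a.name, a.time))
```

With numbers close to the real geometry, PcP should show up a minute or two behind P, in the neighborhood of the pick above.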

 

Squeezing Rocks with your Bare Hands

Our lab demo group. Photo: Chris Marone

As frequent readers of the blog or listeners of the podcast will know, I really like doing outreach activities. It's one thing to do meaningful science, but another entirely to be able to share that science with the people that paid for it (taxpayers generally) and show them why what we do matters. Outreach is also a great way to get young people interested in STEAM (Science, Technology, Engineering, Art, Math). When anyone you are talking to, adult or child, gets a concept that they never understood before, the lightbulb going on is obvious and very rewarding.

Our lab group recently participated in two outreach events. I've shared the demonstrations we commonly use before, when talking about a local science fair. There are a few that probably deserve their own videos or posts, but I wanted to share one in particular that I improved upon greatly this year: Squeezing Rocks.

A while back I shared a video that explained how rocks are like springs. The normal demonstration we used was a granite block with strain gauges on it and a strip chart recorder... yes... with paper and pen. I thought showing lab visitors such an old piece of technology was a bit ironic after they had just heard about our lab being one of the most advanced in the world. Indeed, when I started the paper feed, a few parents would chuckle at recognizing the equipment from decades ago. For the video I made an on-screen chart recorder with an Arduino. That was better, but I felt there had to be a better way yet. Young children don't really understand graphs or time series yet. Other than making the line wiggle, they didn't really get the idea that it represented the rock deforming as they stepped on it or squeezed it.

I decided to go semi old-school with a giant analog meter to show how much the rock was deformed. I wanted to avoid a lot of analog electronics, as they always get finicky to set up, so I elected to go the solution-on-a-chip route with a microcontroller and the HX711 load cell amplifier/digitizer. For the giant meter, I didn't think building an actual meter movement was very practical, but a servo and plexiglass setup should work.
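The core of the meter logic is just scaling load cell counts to a needle angle. Here's a minimal Python sketch of that mapping (the real device runs on a microcontroller, and the calibration numbers here are made up for illustration):

```python
# Map raw HX711 load-cell counts onto a servo needle angle.
# Calibration values below are placeholders, not the real ones.

COUNTS_AT_ZERO = 8_388_608      # raw reading with no load (24-bit midpoint)
COUNTS_FULL_SCALE = 9_000_000   # raw reading at a full-scale squeeze
ANGLE_MIN, ANGLE_MAX = 0.0, 180.0  # servo sweep in degrees

def counts_to_angle(counts):
    """Linearly scale a raw reading to a needle angle, clamped to the dial."""
    frac = (counts - COUNTS_AT_ZERO) / (COUNTS_FULL_SCALE - COUNTS_AT_ZERO)
    frac = max(0.0, min(1.0, frac))  # keep the needle on the dial
    return ANGLE_MIN + frac * (ANGLE_MAX - ANGLE_MIN)

print(counts_to_angle(8_700_000))  # about half scale -> ~92 degrees
```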

A very early test of the meter shows its 3D-printed servo holder inside and the electronics trailing behind.

Another thing I wanted to change was the rock we use for the demo. The large granite bar you stepped on was bulky and hard to transport. I also thought squeezing with your hands would add to the effect. We had a small cube of granite, about 2" on a side, cut with a water jet and then ground smooth. The machine shop milled out a 1/4" deep recess where I could epoxy the strain gauges.

Placing strain gauges under a magnifier with tweezers and epoxy.

Step-by-step build instructions are something I'm working on over at the project's Hack-a-Day page. I'm also getting the code and drawings together in a GitHub repository (slowly, since it is job application time). Currently the instructions are lacking somewhat, but stay tuned. Check out the video of the final product working below:

The demo was a great success. We debuted it at the AGU Exploration Station event. Penn State even wrote up a nice little article about our group. Parents and kids were amazed that they could deform the rock, and even more amazed when I told them that full scale on the meter was about 0.5 µm of deformation. In other words, they had compressed the rock by about 1/40 the width of a single human hair.
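For a sense of scale, that full-scale deflection is a strain of only about ten parts per million on the 2" cube - a quick back-of-the-envelope check:

```python
# Rough strain at full scale on the meter: deformation / sample length.
deformation = 0.5e-6        # meters (0.5 micrometers at full scale)
sample_length = 2 * 0.0254  # meters (the 2-inch granite cube)
strain = deformation / sample_length
print("strain = %.1e (about %.0f microstrain... tiny!)" % (strain, strain * 1e6))
# strain = 9.8e-06 (about 10 microstrain... tiny!)
```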

A few lessons came out of this. Shipping an acrylic box is a bad idea: the meter was cracked on the side in return shipping. The damage is repairable, but I'm going to build a smaller (~12-18") unit with a wood frame and back and acrylic for the front panel. I also had a problem with parts breaking off the PCB in shipment. I wanted the electronics exposed for people to see, but maybe a clear case is better than an open one. I may try open one more time with a better case on it for transport. The final lesson was just how hard on equipment young kids can be. We had some enthusiastic rock squeezers, and by the end of the day the insulation on the wires to the rock was starting to crack. I'm still not sure what the best way to deal with this is, but I'm going to try a jacketed cable for starters.

Keep an eye on the project page for updates and if any big changes are made, you'll see them here on the blog as well. I'm still thinking of ways to improve this demo and a few others, but this was a giant step forward. Kids seeing a big "Rock Squeeze O Meter" was a real attention getter.

Hmm... As I'm writing this I'm thinking about a giant LED bar graph. It's easy to transport and kind of like those test your strength games at the fair... I think I better go parts shopping.

Debugging - Book Review

[Image: "Debugging" book cover]

I end up doing a lot of debugging; in fact, every single day I'm debugging something. Some days it is software and scripts that I'm using for my PhD research, some days it is failed laboratory equipment, and some days it's working the problems out of a new instrument design. Growing up working on mechanical things really helped me develop a knack for isolating problems, but this is not knowledge that everyone has the occasion to develop. I'm always looking for ways to help people learn good debugging techniques. There's nothing like discovering, tracking down, and fixing a bug in something. Also, the more good debuggers there are in the world, the fewer hours are wasted fruitlessly guessing at problems.

I'd heard about a debugging book that was supposed to be good for any level of debugger, from engineer to manager to homeowner. I was a little suspicious, since that is such a wide audience, but I found that my library had the book and checked it out; it is "Debugging: The 9 Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems" by David Agans. The book sat on my shelf for a few months while I was "too busy" to read it. Finally, I decided to tackle a chapter a day. The chapters are short, and I could handle two weeks of following the book. Each morning when I got to work, I read the chapter for that day. One weekend I read an extra chapter or two because they were enjoyable.

I'm not going to ruin the book, but I am going to tell you the 9 rules (it's okay, they are also available in poster form on the book website).

  1. Understand the System
  2. Make It Fail
  3. Quit Thinking and Look
  4. Divide and Conquer
  5. Change One Thing at a Time
  6. Keep an Audit Trail
  7. Check the Plug
  8. Get a Fresh View
  9. If You Didn't Fix It, It Ain't Fixed

They seem simple, but think of the times you've tried to fix something and skipped around because you thought you knew better, only to have it come back and sting you. If you've done a lot of debugging, you can already see the value of this book.

The book contains a lot of "war stories" that tell tales of when a rule or several rules were the key to a debugging problem. My personal favorite was the story about a video conferencing system that seemed to randomly crash. It turns out the video compression had problems with certain patterns, so whenever the author wore a plaid shirt to work and tested the system, it failed. He ended up sending photocopies of his shirt to the manufacturer of the chip. Stories like that made the book fun to read and show how you have to pay attention to everything when debugging.

The book has a slight hardware leaning, but has examples of software, hardware, and home appliances. I think that all experimentalists or engineers should read this early on in their education. It'll save hours of time and make you seem like a bug whisperer. Managers can learn from this too and see the need to provide proper time, tools, and support to engineering.

If you like the blog, you'll probably like this book or know someone that needs it for Christmas. I am not being paid to write this, I don't know the author or publisher, but wanted to share this find with the blog audience. Enjoy and leave any comments about resources or your own debugging issues!

Using Visual Mics in Geoscience

Image: TED Talk

Last time I wrote up the basics of a tip sent in by Evan over at Agile Geoscience. This technology is very neat; if you haven't read that post yet, please do, and watch the TED talk. This post is going to be about how we could apply it to problems in geoscience. Some of these ideas are "low hanging fruit" that could be relatively easy to accomplish; others are in need of a few more PhD students to flesh them out. I'd love to work on it myself, but I keep hearing about this thing called graduation and think it sounds like a grand time. Maybe after graduation I can explore some of these in detail; before then I can just experiment around a bit.

In his email to me, Evan pointed out that this visual microphone work IS seismology of sorts. In seismology we look at the motion of the Earth with seismometers or geophones. If we have a lot of them and can look at the motion of the Earth in a lot of places over time, we can learn a lot about what it's like inside the Earth. This type of survey has been used to understand problems as big as the structure of the Earth and as small as finding artifacts or oil in shallow deposits. In (very) general terms, we look at very low frequency waves for Earth structure problems, with periods of a second to a few hundred seconds. For near-surface problems we may look at signals up to a few hundred cycles per second (Hz). Remember in the last post I said that we collect audio data at around 44,100 Hz? That's because as humans we are able to hear up to around 20,000 Hz. All of this is a lot higher frequency than we ever use in geoscience... I'm thinking that makes this technique somewhat easier to apply and maybe even able to use poor quality images.

So what could it be used for? Below are a few bullet points of ideas. Please add to them in the comments or tear them apart. I agree with Evan that there is some great potential here.

  • Find/visualize/simulate stress and strain concentration in heterogeneous materials.
  • Extract modulus of rock from video of compression tests. Could be as simple as stepping on the rock.
  • Extend the model to add predicted failure and show expected strain right before failure.
  • Look at a sample from multiple camera views and combine for the full anisotropic properties. This smells of some modification of structure from motion techniques.
  • Characterize complicated machines stiffness/strain to correct for it when reducing experimental data without complex models for the machine.
  • Try prediction of building response during shaking.
  • What about perturbing bodies of water and modeling the wave-field?

As with everything in science, engineering, and life, there are tradeoffs. What are the catches here? Well, the resolution is pretty good, but it may not be good enough for the small differences in properties we sometimes deal with. In translating this over to work on seismic data, I think a lot of algorithm changes would have to happen that may end up making it about as useful as our first-principles approaches. A big limitation for earthquake science is what happens at large strains. The model looks at small strains/vibrations to model linear elastic behavior. That's like stretching a spring and letting it spring back (remember Hooke's Law?). Things get interesting in the non-linear part of deformation, when we permanently deform things. Imagine that you stretch the spring much further than it was designed to go. The nice linear-elastic behavior would go away and plastic deformation would start. You'd deform the spring and it wouldn't ever spring back the same way again. Eventually, as you keep stretching, the spring would break. The non-linear parts of deformation are really important to us in earthquake science for obvious reasons. For active seismic survey people, the small strain approximation isn't bad though.

Another issue I can imagine is combining video from different orientations to recover the full behavior of the material. I don't know all of the details of Abe's algorithm, but I think it would have problems with anisotropic materials. Those are materials that behave differently in different directions. Imagine a cube that can be easily squeezed on two opposing faces, but not easily squeezed on the others. Some rocks behave in such a way (layered rocks in particular). That's really important since they are also common rocks for hydrocarbon operations to target! Surrounding the sample area with different views (video or seismic) and using all of that information should do the job, but it's bound to be pretty tricky.

The last thing that strikes me is processing time. I don't think I've seen any quotes of how long the processing of the video clips took to recover the audio. While I don't think it's ludicrous, I could imagine the short clips taking a few hours per 10 seconds of video (this is a guess). For large or long-duration geo experiments, that could become an issue.

So what's the end story? Well, I think this is a technology that we haven't seen the last of. The techniques are only going to get better and processors faster to let us do more number crunching. I'm curious to watch this develop and try to apply it in some basic experiments and see what happens. What would you try this technique on? Leave it in the comments!

Listening to Sound with a Bag of Chips

[Image: MIT visual microphone experiment]

Today I wanted to share some thoughts on some great new technology that is being actively developed and uses cameras to extract sound and physical property information from everyday objects. I had heard of the visual microphone work that Abe Davis and his cohorts at MIT were doing, but they've gone a step further recently. Before we jump in, I wanted to thank Evan Bianco of Agile Geoscience for pointing me in this direction and asking about what implications this could have for rock mechanics. If you like the things I post, be sure to check out Agile's blog!

Abe's initial work on the "visual microphone" consisted of using high-speed cameras to recover sound in a room by videoing an object like a plant or a bag of chips and analyzing the tiny movements of the object. Remember, sound is a propagating pressure wave and makes things vibrate, just not necessarily that much. I remember thinking "wow, that's really neat, but I don't have a high-speed camera and can't afford one". Then Abe figured out a way to do this with normal cameras by utilizing the fact that most digital cameras have a rolling shutter.

Most video cameras shoot video at 24 or 30 frames (still photos) each second (frames per second, or fps). Modern motion pictures are produced at 24 fps. If we change the photos very slowly, we just see flashing pictures, but around the 16 fps mark our minds begin to merge the images into a movie. This is an effect of persistence of vision, and it's a fascinating rabbit hole to crawl down. (For example, did you know that old-time film projectors flashed the same frame more than once? The rapid flashing helped us not see flickers, but still let the film run at a slower speed. Check out the engineer guy's video about this.) In the old days of film, the images were exposed all at once. Every point of light in the scene simultaneously flooded through the camera and exposed the film. That's not how digital cameras work at all. Most digital cameras use a rolling shutter. This means the image acquisition system scans across the CCD sensor row by row (much like your TV scans to show an image, called raster scanning) and records the light hitting each row. This means that for things moving really fast, we get some strange recordings! The camera isn't designed to capture things that move significantly within a single frame. This is a type of non-traditional signal aliasing. Regular aliasing is why wagon wheels in westerns sometimes look like they are spinning backwards. Have a look at the short clip below that shows an aircraft propeller spinning. If that's real, I want off the plane!

But how does a rolling shutter help here? Well, 24 fps just isn't fast enough to recover audio. A lot of audio recordings sample the air pressure at the microphone about 44,100 times per second. Abe uses the fact that the scanning of the sensor gives him more temporal resolution than the rate at which entire frames are captured. The details of the algorithm get complex, but the idea is to watch an object vibrate, and by measuring the vibration, estimate the sound pressure and back out the sound.
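To get a feel for the numbers, here's a rough back-of-the-envelope calculation. The sensor height and frame rate are illustrative assumptions, and real cameras have blanking intervals that reduce the usable rate:

```python
# Rough effective row-sampling rate of a rolling-shutter camera.
rows_per_frame = 1080  # assumed sensor height in rows
frames_per_sec = 60    # assumed frame rate
row_rate = rows_per_frame * frames_per_sec
print("%d rows per second" % row_rate)  # 64800 rows per second
# Compare: whole frames alone sample at only 60 Hz, while audio is
# typically sampled at 44,100 Hz. Rows get us into the right ballpark.
```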

Sure we've all seen spies reflect a laser off glass in a building to listen to conversations in the room, but this is using objects in the room and simple video! Early days of the technology required ideal lighting and loud sounds. Here's Abe screaming the words to "Mary had a little lamb" at a bag of potato chips for a test.... not exactly the world of James Bond.


As the team improved the algorithm, they filmed the bag in a natural setting through a glass pane and recovered music. Finally, they were able to recover music from a pair of earbuds lying on a table. The audio quality isn't perfect, but good enough that Shazam recognized the song!

The tricks that the group has been up to lately, though, are the most fascinating. By videoing objects that are being slightly perturbed by external forces, they are able to model the objects' physical properties and predict their behavior. Be sure to watch the TED talk below to see it in action. If you're short on time, skip to about the 12-minute mark, but really, just watch it all.

By extracting the modulus/stiffness of objects and how they respond to forces, the models they create are pretty lifelike simulations of what would happen if you exerted a force on the real thing. The movements don't have to be big. Just tapping the table with a wire figure on it or videoing trees in a gentle breeze is all it needs. This technology could let us create stunning effects in movies and maybe even be implemented in the lab.

I've got some ideas on some low hanging fruit that could be tried and some more advanced ideas as well. I'm going to talk about that next time (sorry Evan)! I want to make sure we all have time to watch the TED video and that I can expand my list a little more. I'm thinking along the lines of laboratory sensing technology and civil engineering modeling on the cheap... What ideas do you have? Expect another post in the very near future!

Drone Sounding Prototype

Again we have a short project post in between the posts of the open science series (part 3 coming soon)! This time I want to share a fun little project involving cheap drones and an instrument pack that I designed on top of the Light Blue Bean module. The pack uses an HTU21D temperature/humidity sensor and a BMP180 pressure sensor. I designed the board in the open-source PCB/EDA tool KiCAD. Should you want to reproduce the boards, the files to send off to a board house are available in a GitHub repository here.

I designed the pack to be a measurement device for the home, truck, or airplane of the weather enthusiast or storm chaser. Ideally it will send the data to a smartphone/tablet that then sends it out to the web or lets you do whatever you want with it. It was also a good excuse to play with the Bean after hearing about it. While delivering another product to a friend, we decided to strap this sensor to a small and cheap ($33) drone and see what happened. We got some vague data, but the drone couldn't get more than a few meters high due to the heavy load. Zip ties provided some protection on takeoff/landing.

Our initial test flight with some quick plots in the background.

After playing, we thought it would be fun to try this on a drone with some more power. I grabbed a $55 drone (Syma X5C) on eBay and gave it a shot. After a couple of test flights, I just couldn't get the Bluetooth link to stay connected at the distances I wanted (50 m).

My breakout and the bean attached to the top of the drone body.

I added a kludge that wrote data to an SD card using the OpenLog. It was extra weight, since I needed two more coin cell batteries, but the drone turned out to be able to carry it to 45 m once or twice. Then the drone loses signal and shakily falls out of the sky until I can get control again. While inspiring me to drool over more advanced drones, I did get some interesting data! Some of the plots are rather small in web view, but click on them to expand. I just didn't want a bunch of individual figures making the post scroll forever.


First I'll show my first SD-logged flight(s). Below is the altitude plot (derived from the on-board barometric pressure sensor).

A few up/down flights of the drone. The ascent in the grey box will be examined in detail.
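Deriving altitude from pressure uses the standard international barometric formula (the same one most sensor libraries use internally). Here's a minimal sketch; the sea-level reference pressure is an assumption you'd calibrate on the ground:

```python
# Convert barometric pressure (hPa) to altitude (m).
P0 = 1013.25  # assumed sea-level reference pressure in hPa; calibrate on site

def pressure_to_altitude_m(pressure_hpa, p0=P0):
    """Altitude in meters from the international barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / p0) ** (1.0 / 5.255))

# A drop of ~5 hPa corresponds to roughly 45 m of climb near sea level:
print(pressure_to_altitude_m(1008.0))  # ~44 m above the reference
```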

If we take the highest and most constant climb rate ascent (gray box) and look at the temperature/dewpoint data we see rather clean results!

[Image: flight 1 temperature and dew point data]

It was a dead still evening, just before sundown. Without any mechanical mixing, we see radiation from the ground producing a temperature inversion (temperature increases with height here). We also see a nice dew point trend to drier air as we ascend. For fun, I calculated the lapse rate - just how fast the temperature changes with height. Plotting the data and fitting a line, we get about +11 degrees/kilometer of height, a reasonable number. (Perhaps coincidentally, about the negative of the typical dry adiabatic lapse rate? It's been too long and I never did much near-ground meteorology. Thoughts appreciated.)

[Image: lapse rate fit]
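Fitting that line is a one-liner with NumPy. Here's a sketch; the sample values are made up to mimic the shape of the data:

```python
# Estimate the lapse rate by fitting a line to temperature vs. height.
import numpy as np

# Hypothetical samples from an ascent (height in km, temperature in deg C):
height_km = np.array([0.00, 0.01, 0.02, 0.03, 0.04])
temp_c = np.array([24.00, 24.11, 24.22, 24.33, 24.44])

slope, intercept = np.polyfit(height_km, temp_c, 1)
print("lapse rate: %+.1f deg C per km" % slope)  # +11.0 deg C per km
```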

The next evening, in a very similar setup without wind, I did another sounding that got up to 45 meters. On this flight I noticed that the bumps in the temperature and dew point trends match rather well with the bumps in my ascent rate. Since this drone isn't programmable, I fly the ascent by hand, which makes a steady rate tricky to judge. It probably has to do with the sensors needing a lot of settling time to equilibrate to their surroundings (a couple of seconds). Maybe flying small circles on the way up is a solution. I also have the video from this flight if you're curious what it looks like. Nothing too interesting, but the uncontrolled descents are rather exciting. I've read about hacking better antennas onto this drone for more range, so that's a thought. Before I take it much further away, I want to do it in a large field to decrease the risk from a runaway drone. If this proves to be interesting enough, maybe a drone upgrade will be in order. They are pricey though!


Flight 2 data