Monthly Archives: April 2014

Literature Inertia - Maintaining Stability or Discouraging Exploration?


Recently I've been thinking a lot about literature inertia and the best ways to accommodate and deal with it.  What is literature inertia? It's a phrase a professor of mine at Penn State used to describe a common theme in research fields: things are done a certain way because that's the way they have always been done.  Everyone bases their analysis or technique on one "seminal" paper from some point in the past.  The methods in that paper were likely just the first ones tried that succeeded, and everyone has used them ever since.

I can see some benefits to literature inertia.  For one, it provides a consistent way of doing things or a "standard" analysis program that all scientists in the field use.  That kind of stability allows long-term comparisons and comparability between research groups.  That's fantastic! Maybe the method isn't exactly ideal, but it is the same everywhere and eliminates some of the variables that would otherwise be present.  Inertia in a field also means the wheel isn't re-invented all of the time, which saves researchers time and lets them pursue the research, not the methods.  But is that best for the advancement of science?

The downsides of literature inertia are just as significant as its advantages, though.  The original methods or code that become the "standard" were likely among the first that worked well while the research was still in the discovery phase.  They are also, by necessity, a bit old.  Better methods have likely been developed since that could produce better results.  I also believe that the pressure to use a standard procedure discourages exploration.  Funding isn't commonly given to explore and test new ways of solving a solved problem! Literature inertia can also bias a field against an idea for decades.  There are some sub-disciplines that are considered to be very delicate research areas.  Working in these new and poorly understood areas runs the risk of having your career marked early as borderline crankery.  Many reasonable ideas have been floated in these fields, but quickly shot down by those following the inertia.  Often these ideas are thrown out with little work done to legitimately check their validity.  Likewise, one true crank can make an entire area taboo for all researchers.

So what's the answer to this problem? Well, like so many things in science, it probably lies in the gray area in between.  While some stability is needed so that each researcher isn't approaching a problem from a completely different direction, there should be less discouragement of exploration.  Standards are also temporary.  Nothing in research is truly permanent.  Standards will become outdated and need to be replaced.  That process isn't easy, painless, or fun, but it is necessary if science is to remain current and relevant.

Computer data formats are one example I can think of to illustrate inertia.  There are many great formats that will stick around for some time, such as JSON, HDF5, and NetCDF.  Some labs still insist on having their own data format though! This is puzzling, because computer scientists have done a very good job of making flexible data formats that are supported by most major programming and scripting languages.  The labs using in-house formats must distribute readers (normally in only one or two languages) or share bulky text files to collaborate with others.  Why do these labs insist on their format? Because it is what they have been using for years, and they don't want to invest the time and effort to change to a more open format.  Inertia, for those groups, is crippling their ability to use more recent tools.  That matters because if more tools are available to analyze data, and they are easy to use, researchers will find it easier to explore their data.
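To be fair to the "easy to use" point, here is roughly how little code a standard, self-describing format takes.  This is just a minimal sketch in Python using the h5py library; the file, station, and channel names are invented for the example.

```python
import numpy as np
import h5py

# Write a waveform plus its metadata to a self-describing HDF5 file.
# The file and dataset names are placeholders for the example.
samples = np.random.randn(1000)            # stand-in for real data
with h5py.File("example_waveform.h5", "w") as f:
    dset = f.create_dataset("station_XYZ/BHZ", data=samples)
    dset.attrs["sampling_rate_hz"] = 40.0
    dset.attrs["units"] = "counts"

# Any language with an HDF5 binding (C, Fortran, MATLAB, R, ...) can now
# read this file without a custom parser distributed by the lab.
with h5py.File("example_waveform.h5", "r") as f:
    data = f["station_XYZ/BHZ"][:]
    rate = f["station_XYZ/BHZ"].attrs["sampling_rate_hz"]
    print(data.shape, rate)
```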

Another example is the inversion techniques commonly used to solve problems like earthquake location.  Some fields are using inversion techniques that came about in the 1950's.  These techniques work; in fact, they have been tuned over the years to work very well.  For day-to-day operations, that stability is important.  It is the job of researchers, though, to try new techniques and to explore and improve.  Every technique has a weakness, and trying many is important!
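To make the idea of an inversion concrete, here is a toy sketch of the classic linearized least-squares (Gauss-Newton) approach to locating an earthquake in a constant-velocity medium.  This is not any operational center's code; the station geometry, velocity, and "true" source below are all invented for illustration.

```python
import numpy as np

# Invented example: five surface stations (x, y, z in km) and a uniform P velocity.
stations = np.array([[0, 0, 0], [50, 10, 0], [10, 60, 0], [70, 70, 0], [30, 40, 0]], float)
v = 6.0  # km/s, assumed constant velocity

def travel_times(src, t0):
    """Predicted arrival times for a source at src = (x, y, z) with origin time t0."""
    d = np.linalg.norm(stations - src, axis=1)
    return t0 + d / v

# Synthetic "observed" arrivals from a known true source, just to test the inversion.
true_src, true_t0 = np.array([22.0, 35.0, 12.0]), 5.0
t_obs = travel_times(true_src, true_t0)

# Linearized least squares: start from a guess, linearize, solve, iterate.
src, t0 = np.array([40.0, 40.0, 5.0]), 0.0
for _ in range(10):
    d = np.linalg.norm(stations - src, axis=1)
    r = t_obs - travel_times(src, t0)                     # travel-time residuals
    # Partial derivatives: dt/d(x,y,z) = (src - station) / (v * distance), dt/dt0 = 1
    G = np.column_stack([(src - stations) / (v * d[:, None]), np.ones(len(stations))])
    dm, *_ = np.linalg.lstsq(G, r, rcond=None)            # least-squares model update
    src, t0 = src + dm[:3], t0 + dm[3]

print(src, t0)  # should recover the true source location and origin time
```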

I do think that many standard techniques will be challenged as a new group of researchers comes into the job market, but I am concerned about how going against literature inertia could damage long-term job prospects.  I've heard well-respected traditional faculty say things like "This computer data management problem isn't a decision for you early career people or something you should be involved with."  Likewise, I've seen some excellent ideas get pushed out because they aren't the way things have always been approached.  This attitude is likely propagated by the pressure to publish and the damping effect that has on free exploration.  What do you think?

 

It's All About the Waves - 2014's First Magnitude 7+ Event in Chile

There's been a decent amount of chatter amongst Earth scientists that it has been a long time since the last magnitude 7 or greater earthquake.  In fact, there hadn't been one in 2014 until last night.  The earthquake is currently rated an 8.2 (Mww) and occurred in a well-known seismic gap that has been written about a fair amount in recent years.  The last major earthquake in northern Chile was an 8.6 in 1877! Many smaller earthquakes in the area over the last few weeks have kept everyone on their toes.

This location in Chile marks a major plate boundary where the Nazca plate is subducting, or being pushed under, the South American plate.  The idea of subduction is that the two plates are being forced together and one ends up getting pushed underneath the other.  In this case, the cold, dense oceanic crust gets pushed underneath the less dense continental crust.  As we would expect, this means the earthquakes occur on a very shallow-angle thrust fault.  Moment tensor solutions, obtained by analyzing many seismograms, can tell us about the fault.  It turns out the moment tensor solution indicates a dip of about 12-18 degrees on the fault, not out of line with our prediction.  There are a lot more advanced scientific products, such as the moment rate function, here.  It looks like the rupture lasted around 100 seconds and slipped a maximum of 6.5 meters (21 ft.) at a depth of near 30 km (18.6 miles).  The earthquake started a little shallower though, about 20 km (12 miles) down.
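As a quick aside on where that magnitude number comes from: the moment magnitude is tied to the scalar seismic moment from the moment tensor solution through the standard Hanks-Kanamori relation.  A back-of-the-envelope sketch (the moment value below is just one that is consistent with an 8.2, not an official figure):

```python
import math

def moment_magnitude(m0_newton_meters):
    """Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# A scalar moment of roughly 2.5e21 N*m corresponds to about Mw 8.2.
print(moment_magnitude(2.5e21))
```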

There have been many aftershocks with the event, some sizable.  At the bottom of the post I've provided a channel list that I'm using to watch the aftershock sequence on the EpiCentral app (for iPad).  What I want to show are the buoy data though!  When a large earthquake of this type occurs, waves are generated in the ocean and the folks at the Pacific Tsunami Warning Center go into action.  There were some significant waves near Chile (about 2m/6.5 ft.).  It looks like, for the time being, most other locations such as Hawaii may be in the clear.  As I'm writing this the remnants of the waves should reach Hawaii in the next few hours.  Below is a rough travel time map from NOAA.

Image: ChileTsunamiTravelTime (NOAA tsunami travel time map)
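Part of why these travel time maps can be drawn so quickly is that tsunami waves in the open ocean move at the shallow-water wave speed, which depends only on water depth.  Here's a rough sketch, assuming a ballpark average ocean depth and a very approximate Chile-to-Hawaii distance:

```python
import math

g = 9.81          # gravitational acceleration, m/s^2
depth = 4000.0    # assumed average open-ocean depth, m

speed = math.sqrt(g * depth)     # shallow-water wave speed, m/s
print(speed, speed * 3.6)        # ~198 m/s, ~713 km/h -- roughly jetliner speed

distance_km = 10000.0            # very rough Chile-to-Hawaii distance, km
hours = distance_km * 1000.0 / speed / 3600.0
print(hours)                     # on the order of 14 hours of travel time
```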

A list of the observations from the warning center can be found in their most recent statement.  We can actually access the buoy data and look at the waves propagating across the ocean though!

When waves propagate across the water (or many other media) they often experience a phenomenon called dispersion.  The idea is that waves are actually made of many frequency components, or notes if you will.  Because of some physics funny business, the longer period (lower frequency) waves actually travel faster than the short period (higher frequency) waves.  We can see this in the data below.  I'm showing two sea-level stations.  They have different types of sensors, but that's not too important.  Be sure to click on the plot to see it full size (the link will open in a new window/tab)!

Image: Wave_dispersion (sea-level records from the two stations)

We see exactly what theory predicts: long period waves coming in first, followed by progressively shorter period waves.  We also see that stations further out don't record the high frequency waves at all.  This is another phenomenon, in which the medium filters out high frequency waves along the travel path.  We would say that the high frequency waves have been strongly attenuated.
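For the curious, the "physics funny business" mentioned above is the dispersion relation for surface gravity waves, omega^2 = g * k * tanh(k * h).  Here's a quick sketch that solves it for a few periods at an assumed 4 km ocean depth; longer periods come out faster, approaching the shallow-water limit:

```python
import math

g, h = 9.81, 4000.0   # gravity (m/s^2) and an assumed ocean depth (m)

def phase_speed(period_s):
    """Solve omega^2 = g*k*tanh(k*h) for k with Newton's method, return omega/k."""
    omega = 2.0 * math.pi / period_s
    k = omega / math.sqrt(g * h)          # shallow-water starting guess
    for _ in range(50):
        f = g * k * math.tanh(k * h) - omega**2
        fp = g * math.tanh(k * h) + g * k * h / math.cosh(k * h) ** 2
        k -= f / fp
    return omega / k

for period in (100, 300, 1000):          # wave periods in seconds
    print(period, round(phase_speed(period), 1), "m/s")
# Longer periods travel faster, approaching sqrt(g*h) of about 198 m/s.
```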

That's all for now! Thank you for sticking with me through some interesting observations that match the predictions of math and physics!

Channel List:
C;GO01; --;BHZ;VERTICAL, CHUSMIZA, CHILE
C;GO01; --;BHE;EAST-WEST, CHUSMIZA, CHILE
C;GO01; --;BHN;NORTH-SOUTH, CHUSMIZA, CHILE
IU;LVC; 00;BHZ;VERTICAL, LIMON VERDE, CHILE
IU;LVC; 00;BHE;EAST-WEST, LIMON VERDE, CHILE
IU;LVC; 00;BHN;NORTH-SOUTH, LIMON VERDE, CHILE
C;GO02; --;BHZ;VERTICAL, MINA GUANACO, CHILE
C;GO02; --;BHE;EAST-WEST, MINA GUANACO, CHILE
C;GO02; --;BHN;NORTH-SOUTH, MINA GUANACO, CHILE