Let Down Your Hair

Rapunzel. She had long hair. Really long. So long, that although locked in a room at the top of a stairless tower (by the pettiest neighbor-lady ever), she could still entertain the occasional visitor by dropping her hair out the window for them to climb (one assumes hand-over-hand, gym class style). Both in classic stories and modern adaptations like Disney’s Tangled, Rapunzel’s exceptionally long hair is taken as a given. Nobody says, “Hey, how does this woman even have hair that is as long as the height of a prison tower? Everyone knows an ambitious high school junior whose hair falls past her butt, but I’ve never seen anything like Rapunzel’s mane on a real person’s head. Not even close. Is there any way this situation even makes sense?”

Enter Ida Pederson. Ida is an engineer who lives in Sweden. She writes a blog with exhaustive information about healthy hair, growing your hair really long, and taking care of long hair. Part of the story is that she let her hair grow for ten years without cutting it, and luckily for us, took many selfies along the way.

From Ida’s photo documentation, we can glean many data points. For many moments along her decade-long journey, we can see and record the length of her hair. Plotting all these points (length of hair vs month) in the first quadrant allows us to see the linear trend of her hair’s growth, and estimate or compute a best-fit line representing the length of her hair in inches in terms of how many months it had been growing.

The slope of this line, helpfully, tells us the overall rate of Ida’s hair growth in inches per month, which turns out to be about half an inch. Another way to estimate her hair growth rate is to observe that after ten years, Ida’s hair was around 60 inches long, and 60 inches divided by 120 months gives us 0.5 inch per month.
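If you want to compute that best-fit slope yourself, here's a minimal sketch in Python. The data points below are made up for illustration (Ida's actual measurements would come from her photos), but the least-squares recipe is the standard one.

```python
# Hypothetical (month, length-in-inches) readings standing in for
# Ida's photo measurements -- illustrative numbers only.
data = [(0, 2), (24, 14), (48, 26), (72, 38), (96, 50), (120, 62)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Least-squares slope: covariance of month and length over variance of month.
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum(
    (x - mean_x) ** 2 for x, _ in data
)

print(slope)  # growth rate in inches per month -> 0.5 for this data
```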

By way of comparison, this part of the trailer for Tangled gives a sense of how long Disney animators imagined Rapunzel’s hair to be.

Let’s call it 40 feet long.

To reach a length of 40’ would, at a rate of ½” per month, require 80 years of hair growing. However, the film puts her age at 18. So either Rapunzel is lying on her driver’s license, by a lot, or her hair grows at an insane clip. But just how insane?

Let’s assume that Rapunzel’s hair, throughout her life, obeys the rudimentary characteristics of human hair: it started out nonexistent or very short when she was a baby, it grows at a more-or-less steady rate, and it would only get shorter if someone were to cut it. Let’s further assume it had never been cut before the time we meet the maiden in the tower.

Using our simplifying assumptions, that would mean in order for her hair to grow 40’ in 18 years, it would have to grow at a rate of around 2¼ feet a year, or around 2¼ inches a month (since there are as many months in a year as inches in a foot).
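The arithmetic above is quick to verify with Python as a calculator:

```python
rapunzel_length = 40 * 12      # 40 feet of hair, in inches
ida_rate = 0.5                 # inches per month, from Ida's data

# Years of growing needed at Ida's rate:
print(rapunzel_length / ida_rate / 12)   # -> 80.0

# Rate Rapunzel's hair must actually grow by age 18, in inches per month:
rapunzel_rate = rapunzel_length / (18 * 12)
print(round(rapunzel_rate, 2))           # -> 2.22
```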

The implication is clear: enchanted, fairy-tale hair.

Math teachers: want to have this conversation with your students? Check out this week’s featured lesson, Let Down Your Hair.

Siren Song

Some strange things are so common that it's easy to forget how strange they are.  Like, for instance, standing by the side of the road.  Each time you hear a car moving toward you, you know it will sound different after whizzing by.

It happens all the time, but if you stop to think about it, it's pretty weird.  You would definitely notice if the car changed color or something on the way by,1 but the fact that it changes pitch seems normal in our everyday experience, so maybe you've never tried to work out why it happens.  Let's do that.

Sound travels in waves, so we can think in terms of peaks and troughs.  One important aspect of a sound wave is its frequency, how many peaks or troughs an observer experiences per unit of time.  Frequency is typically measured in hertz (Hz), where the unit of time is one second.  So someone listening to a 400 Hz tone would be experiencing 400 wave peaks every second.  The frequency also determines the wave's period, or how long it takes to complete a single peak-to-peak cycle.  In this case, that would be 1/400th of a second.  In other words, period is the reciprocal of frequency.

You can also describe a wave in terms of its wavelength, which is the distance between adjacent peaks or troughs.  Clearly frequency and wavelength must be related: the shorter the distance between peaks, the more frequently an observer would experience one, and vice versa.

The speed of sound is about 340 m/s.  We can use that information to write down the precise relationship between frequency and wavelength.  Check that you agree:

f = \frac{340}{\lambda} \text{  or  } \lambda=\frac{340}{f}
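In code, the relationship is a one-liner each way (with 340 m/s as our approximation for the speed of sound):

```python
SPEED_OF_SOUND = 340.0  # m/s, approximate

def wavelength(f):
    """Wavelength in meters of a sound wave with frequency f in Hz."""
    return SPEED_OF_SOUND / f

def frequency(lam):
    """Frequency in Hz of a sound wave with wavelength lam in meters."""
    return SPEED_OF_SOUND / lam

print(wavelength(400))         # -> 0.85 (meters)
print(round(frequency(0.85)))  # -> 400 (Hz)
```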

Let's say we have a beacon emitting sound, and two observers who are each 340 m away.  Change the frequency below to see how the associated wavelength changes.  You can also see how a change in frequency affects the sound an observer will hear.

As long as everything is stationary, everyone agrees on the frequency the beacon is emitting, but something interesting happens once things start moving.  Here the beacon moves at half the speed of sound.2

Notice that, even though the beacon's frequency hasn't changed, the two observers will no longer agree on that fact.  The forward observer is getting a shorter wavelength/higher frequency, and the rear observer is getting a longer wavelength/lower frequency.  That means the sound seems higher to the person in front, and lower to the person in back!  In fact, the person in front will experience a change in frequency (from high to low) as the beacon moves past.  This phenomenon is known as the Doppler Effect, and it explains why the car's horn in the video seemingly dropped in pitch as it passed the camera.

That's certainly interesting, but we can go further.  We can calculate the apparent frequency ahead of or behind the beacon based on its actual frequency and how fast it's moving.

Notice that, for a stationary beacon, the distance between peaks is just the wavelength.  We already knew that.   But with a moving beacon, the distance between peaks changes by whatever distance the beacon covers in a single period.  To the front observer, the distance would decrease by that amount; to the rear observer, it would increase by that amount.  Here the beacon is still moving at half the speed of sound:

So we can work out the observed wavelength each person would report for a beacon moving at speed s:

\text{Front Observer:   } \lambda_F = \frac{340 - s}{f}

\text{Rear Observer:   } \lambda_R= \frac{340 + s}{f}

So we know what wavelength each observer would report, and we already have a relationship between wavelength and frequency, so with some substitution and a little rejiggering we can write down a general rule for the observed frequencies our two recorders would report, based on the observed wavelengths and the actual frequency and speed of the beacon:

\text{Front Observer:   } f_F = \frac{340}{340-s}f

\text{Rear Observer:   } f_R= \frac{340}{340+s}f
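Those two rules are easy to play with in code. Here's a sketch, again taking 340 m/s for the speed of sound:

```python
SPEED_OF_SOUND = 340.0  # m/s

def observed_frequencies(f, s):
    """Return (front, rear) observed frequencies, in Hz, for a beacon
    emitting frequency f Hz while moving at speed s m/s."""
    front = SPEED_OF_SOUND / (SPEED_OF_SOUND - s) * f
    rear = SPEED_OF_SOUND / (SPEED_OF_SOUND + s) * f
    return front, rear

# A 500 Hz beacon at half the speed of sound:
front, rear = observed_frequencies(500, 170)
print(round(front), round(rear))  # -> 1000 333

# At twice the speed of sound the "front" frequency comes out negative --
# a hint at footnote 3's waves arriving in reverse order.
print(observed_frequencies(500, 680)[0])  # -> -500.0
```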

Play around and see what the observers would hear for a 500 Hz tone from a beacon moving at a few interesting speeds.  What would it sound like if the beacon were moving at the speed of sound?  And (extra cool) what about twice the speed of sound?3

But if a car is coming at you that fast, do us all a favor and step out of the way.  Live to listen another day.

Teachers, want to have this conversation in your classroom?  Check out our lesson, Siren Song.

1.  Okay, actually the car does change color on the way by.  As we'll see, the Doppler Effect depends on the speed of an object relative to the speed of the waves it emits.  Cars can move at an appreciable fraction of the speed of sound, so we can sense the Doppler Effect with our ears.  Even the fastest cars move at a negligible speed relative to light, so the color change is too small to detect.

2.  Obviously we've slowed things down quite a bit for the visualization so you can see what's going on, but notice the beacon is moving at half the speed of the waves it emits.

3.  SPOILER ALERT: The sound would appear to have the correct frequency, but the waves arrive in reverse order.  If a jet flying at Mach 2 were blaring some music, you would hear it in the correct pitch, but backwards!  Briefly.


Letterboxing

Every so often, while kicking back in front of our 52" flatscreens, we're reminded that television used to be different.  Not just more wholesome, but different.  For instance, take a look at these clips from ESPN's flagship show SportsCenter.

The TV on the left shows the intro to the very first episode, while the one on the right is more recent.  Why doesn't the first clip take up the whole screen?  What's up with those bars running along the sides?  Why is that necktie so awesome?  It all has to do with ratios (well, except for that last one).

Actually, it has to do with Thomas Edison.  In the Long-Ago Times (prior to ca. 1975), if you wanted to record video, you needed strips of light-sensitive material known as "film."  You loaded this film into the camera, which had some sprockets that would latch onto perforations in the film and pull it through the camera, exposing frame after frame as it wound its way through.


The physical dimensions of the camera determined its film size, but different cameras required different sizes.  Edison and his buddy William Dickson decided that this was pretty dumb, and that film sizes should be standardized.  In 1892 they were using 35mm film stock; any image recorded on that film had a width:height ratio of approximately 4:3.  This measurement is known as the film's aspect ratio.  Because everybody generally agreed that Edison was a genius, 4:3 quickly became the industry standard.  In fact, even today 4:3 is often referred to as "standard format" for video.

For a while, movies continued to be shot in 4:3; television followed suit, so watching a movie on a television screen was no big deal.  But then movies got wider: eventually, we got the 16:9 "widescreen" format and the 21:9 "cinematic" format.  Currently, 16:9 is the conventional format for HDTV, so newer shows are often filmed in this ratio.

Which brings us back to SportsCenter.  That first episode was shot when the 4:3 ruled the screen, but nowadays 16:9 reigns supreme.  And here's where we run into a problem: you can't just scale old 4:3 content to fit onto a 16:9 screen, because the ratios are different.  No amount of scaling can change the ratio.1  Here's an interactive that will let you compare an image on both a 4:3 and a 16:9 screen:

Is there anything we can do to have the image fill the frame in both screens?  Well, yes, sort of.  You could scale the width and height by different amounts.  For instance, to get a 4:3 image to cover a 16:9 screen, you could scale the width by a factor of 4 and the height by a factor of 3.  In fact, if your TV has a "Zoom Stretch" option, that's exactly what it does.  Are you sitting by your TV right now?  How does "Zoom Stretch" look?  If you answered "terrible," you are correct.  The image gets distorted, which isn't such a huge deal for rectangles, but it's pretty rough on things like human faces.

Another option would be to scale the image uniformly until it's as big as possible, without losing any of the original content, and then fill in the leftover space with black bars.  This is what's happening in our SportsCenter example, and it's called "Letterboxing."2  The bars are a bit annoying, and you waste a lot of screen area, but at least you can see everything.
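Here's a rough sketch of that computation: scale uniformly until the content just fits, then measure the leftover bars. Exact fractions keep the ratios honest.

```python
from fractions import Fraction

def letterbox(content_ratio, screen_w, screen_h):
    """Fit content of the given width:height ratio onto a screen,
    scaling uniformly; return scaled (w, h) and per-side bar sizes."""
    screen_ratio = Fraction(screen_w, screen_h)
    if content_ratio >= screen_ratio:
        # Content is at least as wide as the screen: full width,
        # horizontal bars top and bottom (classic letterboxing).
        w, h = Fraction(screen_w), screen_w / content_ratio
    else:
        # Content is narrower: full height, vertical bars at the
        # sides ("pillarboxing", as in the SportsCenter clip).
        w, h = screen_h * content_ratio, Fraction(screen_h)
    return w, h, (screen_w - w) / 2, (screen_h - h) / 2

# 4:3 footage on a 1920x1080 (16:9) screen:
w, h, side_bar, top_bar = letterbox(Fraction(4, 3), 1920, 1080)
print(w, h, side_bar, top_bar)  # -> 1440 1080 240 0
```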

When you're trying to go the other way, say watching a 16:9 movie on an old 4:3 TV screen, you can also zoom so that the image takes up the whole screen, which lops off a bunch of the content.  To compensate for the lost image when zooming, you can pan along the cropped image to show more of what's going on.  This technique is called "pan and scan," and was pretty common in the era of video tapes made for home viewing.  Directors hated pan and scan, because it effectively meant somebody edited the film to create a fundamentally different cinematic experience.  Here are some bigwigs on the subject:

Next time you're watching Lucy reruns on your widescreen, or Lawrence of Arabia on your old 13" set, think about how what you're seeing reflects some interesting decisions in entertainment history.  Think about the ways your technology tries to compensate for those incompatibilities.  And think about Thomas Edison.

Teachers, want to have this conversation in class?  Check out our lesson, Letterboxing!

1. No amount of isotropic scaling (i.e. scaling uniformly in all directions) will affect the ratio.

2. Presumably because the horizontal bars resemble mail slots. Technically, when the bars are vertical it's called "pillarboxing," but "letterboxing" is more common.

Pounding Headache

Pharmaceutical companies take the motto "no pain, no gain" quite literally; without pain to medicate, they'd lose out on billions of dollars in the United States alone.  But how does your body respond to this sort of medication...and how much can you safely take?

For simplicity, let's consider one of the most popular over-the-counter pain relievers: Tylenol.  When you take a couple of Tylenol tablets, the active ingredient (acetaminophen) is absorbed into your system and then gradually dissipates.  Though individual responses may vary, typically acetaminophen has a half-life of around 3 hours once it's working its magic: this means that if you take two regular strength tablets (650 mg acetaminophen), you'll have 325 mg in your system after 3 hours, 162.5 mg after 6 hours, and so on.  In other words, if you take an initial amount A₀, the amount of acetaminophen in your system after t hours will be A₀ × (1/2)^(t/3).

Here's an interactive that lets you play around with these parameters to see how much acetaminophen will be in your system over time.  The red line lets you model the point after which the acetaminophen stops having a therapeutic effect, because there isn't enough in your system.  You can use the graph to confirm recommendations on Tylenol bottles; for instance, with a half-life of 3 hours, two regular strength Tylenol tablets (650 mg acetaminophen) will dip below a 200 mg threshold after about 5 hours, while it's recommended that you take two tablets every 4-6 hours.
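The decay model is simple enough to check with a few lines of Python:

```python
import math

def remaining(a0, t, half_life=3.0):
    """mg of acetaminophen left t hours after a dose of a0 mg."""
    return a0 * 0.5 ** (t / half_life)

print(remaining(650, 3))  # -> 325.0
print(remaining(650, 6))  # -> 162.5

# Hours until a 650 mg dose dips below 200 mg: solve 650*(1/2)^(t/3) = 200.
t = 3 * math.log2(650 / 200)
print(round(t, 1))  # -> 5.1, matching the roughly 5 hours above
```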

But if a little bit of Tylenol is therapeutic, a lot must be even more so, right?  Well, not quite.  While your pounding head may love a dose of acetaminophen, your liver is less of a fan, and too much acetaminophen can cause it to shut down.  Also, unlike other pain relief drugs, the difference between a healthy dose of acetaminophen and a dangerous dose can be as small as two extra-strength Tylenol tablets (1000 mg acetaminophen).

Suppose you've got a killer headache, and take two extra-strength Tylenol tablets every 6 hours for an entire day.  Whether or not this is dangerous depends on your individual tolerance, as well as whether or not you're taking any other drugs containing acetaminophen.  A conservative estimate for the danger zone is 4000 mg; above that, your liver may not be able to handle everything you've given it.  Using the model above, it may seem like taking two tablets every 6 hours is fine: as the graph below shows, the total amount of acetaminophen in your body never exceeds around 1331 mg during that 24-hour period:
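That peak can be reproduced by summing the leftovers of each earlier dose. A sketch of the model (not medical advice):

```python
def level_after_doses(dose, interval_h, num_doses, half_life=3.0):
    """Total mg in the system immediately after the last of num_doses
    doses of `dose` mg, taken every interval_h hours."""
    return sum(
        dose * 0.5 ** (k * interval_h / half_life) for k in range(num_doses)
    )

# Two extra-strength tablets (1000 mg) every 6 hours, four doses in a day:
peak = level_after_doses(1000, 6, 4)
print(round(peak))  # -> 1328, in the ballpark of the ~1331 mg cited above
```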

However, it's not the amount at any one point in time that matters; it's the cumulative effect that can be damaging.  And in fact, it can be damaging in ways that, until recently, weren't very well known.  (This American Life did a great story on Tylenol and acetaminophen last year.)  As a matter of fact, a few years ago the makers of Tylenol dropped the maximum daily recommended dose from 4000 mg of acetaminophen to 3000 mg; that's 6 extra strength tablets per day.  So taking two tablets every 6 hours means taking eight tablets a day, just two over the recommended dose.  For some people, that's fine.  But it's probably not worth pushing your luck.

Teachers: want to have this conversation with your students?  Then check out our latest lesson, Pounding Headache.

Out of Left Field

Major League Hall-of-Famer Branch Rickey once said, "When I hit a home run I usually didn't care where it went.  So long as it was a home run was all that mattered."

While that's a totally reasonable philosophy — after all, you don't earn extra runs based on distance or anything — if you want to hit home runs in the first place, where the ball goes actually matters quite a lot.  It also matters where you happen to be standing when you swing the bat.

Baseball is a little strange, as team sports go.  If you're a basketball fan, you'll remember that great scene in Hoosiers where Gene Hackman's scrappy underdogs are wandering around the enormous gym where they're about to play for the state championship, and they're looking all kinds of nervous.  He pulls out a measuring tape to check the height of the basket and says, "Ten feet.  I think you'll find it's the exact same measurements as our gym back in Hickory."  In other words, any basketball player, stepping onto any court, knows exactly what the playing field looks like.  But that's not the case in baseball: not all ballparks are created equal.

In fact, many ballparks aren't even created symmetrical.  So Branch wasn't being completely honest about his preferences.  You should care where your home run to right field went, because it's entirely possible that same shot to left field wouldn't have made it past the warning track.  And the same exact fly ball that ends up being a home run in Yankee Stadium might be an easy out in St. Louis.

Fenway Park in Boston offers maybe the most obvious example of what we're talking about.


A little weird, right?  Left field is strangely shaped,1 and at a relatively puny 315 feet, it's a lot shorter than center.   To compensate for the weirdness, so that right-handed hitters aren't constantly blasting home runs, the left field wall -- the famous "Green Monster" -- is a whopping 37 feet tall, the highest among all current MLB stadiums.

This means that a lot of deep line drives that would be home runs in other parks get sucked up by the Green Monster.  In other words, it's not just the distance of a hit that determines whether or not it will result in a home run; the trajectory matters too.  And, with a little more information, we can start to work out some facts about trajectories.

Let's say the path of any given ball is roughly parabolic.2  The average home run to left field in 2013 reached a maximum height of about 81 feet and traveled a distance of 378 feet from home plate.3  If we reckon that most home runs begin about three feet off the ground (roughly the middle of the strike zone), that gives us enough info to calculate an "average" trajectory that can be modeled by the equation:

height ≈ -0.0022(distance − 187.2)² + 81

As you can see, even though the Green Monster gobbles up line drive homers, the average home run would still get out of left field in Fenway.  But what about the other stadiums?  Would the average home run be good enough in most parks?  Here is a table with the distance to and height of every left field wall in Major League Baseball:

Team Distance Height Team Distance Height
Orioles 333 ft 7 ft Athletics 330 ft 8 ft
Dodgers 330 ft 3.83 ft Red Sox 315 ft 37 ft
Mariners 331 ft 8 ft Marlins 330 ft 8 ft
White Sox 330 ft 8 ft Rays 315 ft 9.5 ft
Brewers 344 ft 10 ft Indians 325 ft 19 ft
Rangers 334 ft 14 ft Mets 335 ft 12 ft
Tigers 345 ft 8 ft Blue Jays 328 ft 10 ft
Phillies 329 ft 9.25 ft Astros 315 ft 21 ft
D'Backs 330 ft 7.5 ft Pirates 325 ft 6 ft
Royals 330 ft 9 ft Braves 335 ft 8 ft
Padres 334 ft 4 ft Angels 330 ft 4.75 ft
Cubs 355 ft 16 ft Giants 339 ft 8 ft
Twins 339 ft 8 ft Reds 328 ft 12 ft
Cardinals 336 ft 9 ft Yankees 318 ft 8 ft
Rockies 347 ft 8 ft Nationals 336 ft 8 ft

At first, it seems like we'd have to do a lot of calculating to figure this out.  But notice that the Cubs' stadium (Wrigley Field) has the longest left field fence in the league at 355 feet.  If we plug 355 feet into our formula above, we see that the average home run ball is over 18 feet off the ground (and thus even higher at shorter distances), so it will clear most fences.  In fact, the average home run would be a home run anywhere, so maybe the fact that baseball stadiums are designed so differently doesn't actually matter so much for home run hitters.  Maybe Branch was right after all.
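You can also let code do the plugging-in. Here's a sketch using the trajectory model and a few rows from the table above:

```python
def homer_height(distance):
    """Height (ft) of the average 2013 left-field home run when it is
    `distance` feet from home plate, per the model above."""
    return -0.0022 * (distance - 187.2) ** 2 + 81

# A few (distance to wall, wall height) rows from the table:
walls = {
    "Red Sox": (315, 37),   # the Green Monster
    "Cubs":    (355, 16),   # longest left field in the league
    "Astros":  (315, 21),
}
for team, (dist, height) in walls.items():
    ball = homer_height(dist)
    verdict = "clears" if ball > height else "doesn't clear"
    print(f"{team}: ball at {ball:.1f} ft, wall at {height} ft -- {verdict}")
```

Even over the Green Monster, the average home run arrives at around 45 feet up, so it sneaks out everywhere in this sample.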

Teachers: want to have this conversation in your class?  Check out our latest lesson, Out of Left Field!

1.  That's Lansdowne Street running right behind the stadium and screwing everything up for everybody.

2.  It's not.  Unless your stadium is in a vacuum, which is uncomfortable for fans and players alike.  But this gives us a first approximation and some tidy math.

3.  That's the estimated total distance the ball would have traveled before returning to a height level with home plate, had it not been impeded by things like bleacher seats, light poles, and spectators' faces.

About Time

Sometimes it feels as though technology is moving so fast that it's hard to keep pace.  Of course if we're talking about transportation technology, then that's often literally true — for example if you're sitting in a metallic contraption moving at more than 1,300 miles per hour — but you get the point.

Actually, planes are a good example.  In 1903 the Wright Brothers made the first successful powered flight.1  The trip was a whopping 120 feet in length, meaning the entire flight occurred over a shorter distance than the wingspan of a 747.  Less than 70 years later, Neil Armstrong put his foot down on the moon.  That's nuts.  But is that typical, or has the rate of technological progress actually been speeding up?

Maybe that's tough to answer on a 20th Century scale, so let's step back a bit.  Like way back.   By way of introduction, perhaps you remember being traumatized by this at some point in your life:

It's only a cartoon, but you can imagine how often this scene played out during the age of the dinosaurs: some poor Stegosaurus getting whooped on by a ferocious Tyrannosaurus Rex.  In fact, it happened precisely zero times.  We have this vision of the dinosaurs roaming the Earth together, but they roamed the Earth for a really, really long time, and different species came and went along the way.  In fact, more time separated the Stegosaurus from the T. Rex than separates the T. Rex from us.

We can think of technology a little like we think of dinosaurs.  Humans have been inventing things for a really long time, and the progression isn't necessarily steady or smooth.  Here are a few timelines that show some of the important technological milestones.

Timelines 1

It took quite a while for Homo sapiens to figure out how to roll stuff around on circular bits of other stuff, but then things started heating up.  After the not-so-impressive 397,000 years it took us to construct a wheeled cart,3 we only needed a measly 5,000 more to safely rocket somebody to the moon.

So that's the big picture: kind of slow going at first, but we got better.  Let's focus on some developments from a more modern period.  Two categories of thing we like to invent are: (1) things to move us around, and (2) things to help us talk to each other.  So here are some advances in transportation and communication.

Timelines 2

Timelines 3

It took us almost 350 years to go from the mass printing of characters on paper to being able to send those letters as electric pulses, but then it was only another 84 years before we could send our actual voices using electricity.  And then only 44 years before we could send our voices out through the freaking air.  You get the picture.

Transportation is similar.  If the Wright brothers could have managed it logistically in 1903, it would have taken them 18 days to fly across the United States.  Only ten years later that time would have been cut down to a little over two days.  Fast forward 50 years, and you can go coast to coast in a paltry six hours.

In an era when the Next Big Thing seems to be coming ever more quickly, what do you think the next 100 years might look like?  It's hard to imagine, but think about people who were born in 1880 and died in 1980.  They would have been old enough to watch the very first automobiles roll down the streets, and they were still alive to watch as human beings routinely launched themselves into outer space.  It feels like almost anything is possible a century from now.

Teachers, want to have this conversation in class?  Check out our newest lesson, About Time!

1.  Well, Orville did.  They took turns piloting over the course of four successful flights that day, but Orville made the first one.  Wilbur, being the older brother, made the first attempt earlier in the week.2  He promptly crashed, necessitating three days' worth of repairs.  I like to imagine Orville saying something like, "Um, maybe I'll take this one, bro..."

2.  Actually, they flipped a coin to decide who would get to take the first crack at it.  Not all big brothers are jerks.

3.  Still only slightly longer than it takes to assemble the average Ikea coffee table.

Coupon Clipping

If we Americans love one thing, it's a bargain.  And we're savvy about it, too.  We know that only suckers pay retail, that the Internet is basically one giant going-out-of-business sale, and that there is always a coupon.  You want to stick us with the sticker price?  WE WILL TURN THIS CAR AROUND.  We will not be hustled.

Of course, we're getting hustled every day.  Much of the time, our beloved discounts are just a thinly veiled fiction.  Exempli gratia, J. C. Penney.  Not only is it the place you have to walk through on your way to the Cinnabon, it's also historically been the King of Department Store Discounts.  Big time.  In January of 2012, in fact, less than 1% of the store's revenue came from full-price purchases.

On the face of it, all those deals seem awesome, but at some point — say, when almost 75% of your merchandise is discounted by at least half — you are no longer offering "discount" prices; you're simply offering what industry insiders refer to as "prices." Except these "prices" are kind of an enormous pain to deal with, because first you have to take what you want to sell for $40 — a couple of sweatshirts, for instance — change all the price tags, then design and print and distribute coupons and signs and banners and fliers and inserts.  Not to mention the emails and Tweets and Facebook posts and Snapchats and whatever thing is happening now.  Plus your inventory management people are borderline hysterical because their normally tame turnover projections are fluctuating wildly with every new "sale," and somebody is screaming because Theresa from Accounting caught an errant projection in the eye.  And basically it's a ridiculous mess.  For example, here are four completely obtuse ways to say "Hey everyone, we'll sell you a pair of sweatshirts for $20 each."

[Images: four different sweatshirt price tags paired with four coupons, each combination working out to $20 per sweatshirt]


It's especially ridiculous because, as mentioned above, we're savvy.  We understand exactly what's going on behind the scenes, and we've come to accept it as normal.  In fact, the whole reason we know only suckers pay retail is that we've been conditioned to understand that "retail price" is a completely meaningless number whose sole purpose is to let us know how good a deal we're getting when the thing is on sale.  Which is always.

There are a couple of problems with this system.  It's expensive and logistically annoying for retailers, sure; but it's also not great for the consumers. Quick: would you rather get 33% more hand lotion for the normal price, or 33% off the price of a normal sized bottle?  Yup, the discount is a far better deal, even though the giant numbers on my hypothetical sign are exactly the same.  And even if the two offers are actually the same value (let's say 50% more lotion vs. a 33% discount), people will overwhelmingly choose the bonus lotion over the discounted lotion.  We really, really like bonus stuff.  We're savvy, but we're also human, which is to say occasionally irrational.
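Unit price settles the lotion question. Here's a sketch with hypothetical numbers ($6 for a 12 oz bottle), reading the "33% discount" in the equal-value pairing as exactly one-third off:

```python
def unit_price(price, ounces):
    """Dollars per ounce."""
    return price / ounces

price, size = 6.00, 12.0  # hypothetical: $6 for 12 oz of lotion

bonus    = unit_price(price, size * 1.33)        # 33% more lotion, same price
discount = unit_price(price * (1 - 0.33), size)  # 33% off, same size

print(round(bonus, 3), round(discount, 3))  # the discount is cheaper per ounce

# The "equal value" pairing: 50% more lotion vs. exactly one-third off.
print(round(unit_price(price, size * 1.5), 3) ==
      round(unit_price(price * (1 - 1 / 3), size), 3))  # -> True
```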

Enter Ron Johnson, who took over J. C. Penney in early 2012 and promised to put an end to "fake prices."  Instead of inflating sticker prices and then offering about a 40% discount on average, he decided to drop the discounting charade and just lower the retail prices by an average of about 40%.  Simple.  No more meaningless numbers, no more zany sales.  Just stable, reasonable prices that weren't significantly different from what consumers were already paying.  Finally, an end to the madness.  What's not to like?

Apparently everything.  J. C. Penney lost a bunch of money, Johnson got canned, and the company immediately set about apologizing for the error of its ways.  The error being, it seems, that the whole idea was just too reasonable and sane.  Johnson miscalculated badly, because he didn't count on us.  The consumers.  We crave bargains, even when they don't make any sense.  We don't want to pay less money, we want to feel like we're winning at shopping.  And that's a tough habit to kick.

Teachers, want to have this conversation in your class?  Check out our latest lesson, Coupon Clipping.

Sharper Image

Cameras today come with all sorts of bells and whistles.  With so much choice, it can be difficult to understand what separates one camera from another. For many of us, the end result is that we simply use the cameras on our phones. But the question remains: what sorts of features are we missing out on, and if you like taking pictures, are they worth investing in?

That's a pretty big question, and one that probably can't be answered adequately in a single blog post. So let's focus our attention on just one aspect of the camera: the lens, and some of its features.

When you look at a lens online, some of the first numbers you see describe its focal length.  Let's not get into a technical definition; the gist of it is that the longer the focal length, the more "zoomed-in" the camera's view appears.  To put it another way, the longer the focal length of a lens, the smaller the field of view, which measures how much the camera can see.  For example, with a 50mm focal length, a 35mm camera can "see" with an angle of 27 degrees in the vertical direction:


but with a 200mm focal length, the same camera can only "see" around 6.9 degrees:


Now, the zoom doesn't just affect how close you appear to be to the object you're photographing; it also impacts how difficult it is to capture the image at all.  This is because when the focal length is large (i.e. when you're more zoomed in), the camera is more sensitive to motion.  Even small movements are enough to make for a blurry image at a high focal length.

[Image: zoomed-in indoor photo of apples]

That's one blurry, zoomed-in picture of an apple.

We can model how much of an error we can expect with a little bit of mathematics.  Suppose you're standing a distance d away from the object you're trying to photograph, and your vertical viewing angle is α.  In this scenario, the camera's total vertical coverage would equal 2d tan(α/2).

But now imagine that your breathing causes unwanted upwards motion by an angle of θ.  The bottom half of your vertical coverage won't change, but the top half will increase; instead of a total vertical coverage of 2d tan(α/2), it will now equal d tan(α/2) + d tan(α/2 + θ):

For example, if you're standing 50 feet away from your subject with a 200mm lens, your vertical coverage will equal 100 tan(6.9°/2) ≈ 6.03 feet.  But if the camera moves upwards by as little as 2°, the vertical coverage (including the error) becomes 50 tan(6.9°/2) + 50 tan(6.9°/2 + 2°) ≈ 7.78 feet.  In other words, this tiny movement results in an error of 7.78 − 6.03 = 1.75 feet, or 1.75 ÷ 7.78 ≈ 22% of the total vertical coverage! That much error can only mean one thing: a blurry image.

This is why some lenses (especially ones with long focal lengths) come with vibration reduction (VR) technology. VR effectively reduces movement error by a factor of around 16, i.e. a 2° error without VR technology would be only around a 1/8° error with it.  And as you can see, the resulting error percent becomes significantly smaller too:
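The numbers above are easy to check for yourself. Here's a quick sketch in Python, plugging in the article's 50-foot distance, 6.9° viewing angle, and 2° of camera shake; the 16x reduction is the article's figure for VR.

```python
import math

def vertical_coverage(d, alpha_deg, shake_deg=0.0):
    """Vertical span captured from distance d with viewing angle alpha_deg,
    when the camera tilts upward by shake_deg (0 = perfectly still)."""
    half = math.radians(alpha_deg / 2)
    shake = math.radians(shake_deg)
    # The bottom half of the frame is unchanged; the top half grows with the tilt.
    return d * math.tan(half) + d * math.tan(half + shake)

d, alpha = 50, 6.9                                # feet, degrees (200mm lens)
ideal = vertical_coverage(d, alpha)               # ~6.03 ft
shaky = vertical_coverage(d, alpha, 2.0)          # ~7.78 ft
with_vr = vertical_coverage(d, alpha, 2.0 / 16)   # VR cuts the shake ~16x

print(f"error without VR: {(shaky - ideal) / shaky:.1%}")      # prints 22.6%
print(f"error with VR:    {(with_vr - ideal) / with_vr:.1%}")  # prints 1.8%
```

The exact figures here carry one more decimal place than the rounded values in the text, but the story is the same: VR knocks the error down by an order of magnitude.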

But is VR technology worth the additional cost? Well, it depends. If you don't need a lens with a long focal length, then VR may not be worth it, since the percent error you'll get from movement is already pretty small. You also may not need VR technology if you do most of your photography outside with a lot of light. This is because the more light you have, the faster you can make your shutter speed; the faster your shutter speed, the less time the camera has to capture light and create a blurry image.


That same apple looks much less blurry when a picture of it is taken outside.

So in the end, whether to go for VR depends on the kinds of pictures you want to take. If you've ever wished that the phone in your pocket took better pictures, sorting out these kinds of scenarios might just help you make a more informed purchasing decision.

Teachers: want to talk to your students about cameras? Then check out our latest lesson, Sharper Image.


In May of 2013, 84-year-old Gloria MacKenzie walked into a Publix supermarket in Zephyrhills, Florida, to buy a Powerball ticket.  Less than a month later, her son politely escorted her down to the lottery office to claim a record $590.5 million jackpot,1 completing what must be the best trip to a Publix supermarket in human history.

So how does a lottery jackpot get to over half a billion dollars?  Unlike most games in the U.S., which are run by individual state governments, Powerball is one of a handful of lotteries run by a consortium of states2, so players from all across the country pay into one giant kitty.  That can lead to some astronomical top prizes.  But is it worth playing?  I mean, if you're not Gloria MacKenzie?

First, the rules.  There are 59 white balls labeled {1, 2, ... , 59} and 35 red balls labeled {1, 2, ... , 35}.  Five of the white balls are drawn, along with just one of the red balls (the "Powerball" of eponymous fame).  A ticket costs $2, and consists of six numbers (five from 1 to 59, and a sixth from 1 to 35).  You win the jackpot if your ticket matches all five white balls, in any order, and also the Powerball.  So what's the probability that you win the grand prize?

Notice that there is only a single winning combination of numbers, so we only have to count the total number of draws possible.  A draw will consist of a combination of 5 balls chosen from a set of 59, along with a single ball chosen from a set of 35.  In other words, there are:

{59 \choose 5}{35 \choose 1} = 175{,}223{,}510

  possible draws, for a 1 in 175,223,510 chance of hitting it huge.  Not so awesome.
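If you'd like to verify that count yourself, Python's math.comb does it in one line:

```python
import math

# 5 white balls chosen from 59, times 1 red Powerball chosen from 35
total_draws = math.comb(59, 5) * math.comb(35, 1)
print(total_draws)  # prints 175223510
```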

But there are other ways to win at Powerball besides matching every single number.  In fact, there are nine different payouts, depending on the number of matches, and whether your matches include the Powerball.  Here they are, along with their probabilities:

There are a few weird things going on here.  For one, notice that the payouts are wildly out of line with the probabilities.3  For another, different winning combinations can have the same payout, even though one combination might be twice as likely as the other.  And finally, why is the probability of matching just the Powerball not 1 in 35?4

That last question is actually helpful for figuring out the rest of the probabilities as well.  Sure, you have a 1 in 35 chance of matching the Powerball, but in order to qualify for the bottom prize, you have to also not match any of the white balls, otherwise you would get bumped up into a different prize category.  In other words, your ticket must match the Powerball, and include 5 of the 54 white balls not drawn.  A quick calculation will tell you that:

\frac{{54 \choose 5}{1 \choose 1}}{{59 \choose 5}{35 \choose 1}} \approx \frac{1}{55.41}
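That counting argument generalizes to every prize tier: a ticket matching exactly w of the five drawn white balls must draw its other 5 − w white numbers from the 54 undrawn balls, and either matches the red ball (1 way) or doesn't (34 ways). Here's a sketch in Python that runs through all nine tiers:

```python
import math

TOTAL = math.comb(59, 5) * math.comb(35, 1)  # 175,223,510 possible draws

def count(whites, powerball):
    """Number of tickets matching exactly `whites` of the 5 drawn white
    balls, and matching the red ball iff `powerball` is True."""
    white_ways = math.comb(5, whites) * math.comb(54, 5 - whites)
    red_ways = 1 if powerball else 34  # one winning red ball, 34 losing ones
    return white_ways * red_ways

# the nine winning combinations, from jackpot down to Powerball-only
tiers = [(5, True), (5, False), (4, True), (4, False), (3, True),
         (3, False), (2, True), (1, True), (0, True)]

for w, pb in tiers:
    label = f"{w} white" + (" + PB" if pb else "")
    print(f"{label}: 1 in {TOTAL / count(w, pb):,.2f}")
```

The last line of output confirms the bottom-prize probability of roughly 1 in 55.41.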

The rest of the probabilities can be verified using the same technique.  And once you have all of those, you can work out the expected value of a Powerball ticket.  Well, almost.  Since the jackpot varies from week to week, we can actually ask what the jackpot would need to be in order to produce an expected value of 0, which would represent the minimum grand prize required to justify buying a ticket.

It turns out the magic number is just under $280 million.  So, theoretically, any time the jackpot is greater than that, your Powerball purchase has a positive expected value.  Well, not quite.  So far we've ignored the fact that you're going to have to pay taxes on that money.  A lot of them.  And in terms of real dollars, you're also probably better off taking the discounted lump sum and investing it rather than taking the annuity.  Either way, your actual monetary gain is far, far less than the advertised jackpot.  We've also ignored the probability of having to split the prize!  Jackpots get bigger, so more people buy tickets, so jackpots get bigger, etc.
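To make the break-even calculation concrete, here's a sketch that plugs in the standard 2013 Powerball prize amounts. Those payouts are an assumption on my part (the article's payout table may differ slightly), so the exact result depends on the table used, but it lands in the same high-$200-million neighborhood:

```python
import math

TOTAL = math.comb(59, 5) * math.comb(35, 1)  # 175,223,510 possible draws

def count(whites, powerball):
    """Tickets matching exactly `whites` white balls, and the red iff `powerball`."""
    return math.comb(5, whites) * math.comb(54, 5 - whites) * (1 if powerball else 34)

# (whites matched, powerball matched?, prize) -- assumed 2013 payout amounts
tiers = [(5, False, 1_000_000), (4, True, 10_000), (4, False, 100),
         (3, True, 100), (3, False, 7), (2, True, 7),
         (1, True, 4), (0, True, 4)]

# expected winnings per ticket from every prize except the jackpot
small_ev = sum(count(w, pb) * prize for w, pb, prize in tiers) / TOTAL

# break-even: jackpot/TOTAL + small_ev - ticket_price = 0
ticket_price = 2
breakeven = (ticket_price - small_ev) * TOTAL
print(f"non-jackpot EV per ticket: ${small_ev:.2f}")
print(f"break-even jackpot: ${breakeven / 1e6:.0f} million")
```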

The actual jackpot you would need to justify a ticket purchase, from the standpoint of expected value, is probably beyond record-setting.  Then again, you might walk into just the right Publix, on just the right day.  You just might make the paper.

Teachers, want to have this conversation in class?  Check out the lesson materials on our site!

1.  That's still (as of this writing) the largest Powerball jackpot in U.S. history, and #3 among all U.S. lottery games.  Unwilling to wait until her 114th birthday for the annuity to pay off, Gloria instead opted for the $370.9 million lump sum.  After taxes, she actually took home somewhere around $278 million.

2.  The Multi-State Lottery Association or — for somewhat impenetrable reasons — MUSL.

3.  And they are probabilities, even though they are (incorrectly) labeled as 'Odds.'

4. The MUSL gets this question so much that it appears on their Powerball FAQ page, which, if you have some time to kill, is hilarious and well written.  Hat tip.

Bundle Up

With today's cable TV packages, subscribers have access to more channels and more content than they could ever possibly consume. For die-hard entertainment fans, the amount of choice is likely a dream come true.

All that choice isn't free, though, and prices for the most popular cable packages have increased fairly steadily over the past 20 years. (And with the pending merger of Comcast and Time Warner, many people are concerned that prices will only continue to rise.) Here's a chart of how a typical monthly cable bill has changed from 1995 through 2012, courtesy of the FCC:

On the other hand, here's a graph of how the number of channels has changed:

From these two sets of data, we can calculate the "cost per channel" (how much you pay divided by how many channels you get) to see whether your money is really buying you more content or not.

Based on the graph above, for many years the average cost per channel increased; recently, though, the number of channels has grown so quickly that the cost per channel has dipped below its 1995 level.

But again, it's not like everyone with a cable subscription watches every channel they have access to. Most of us have a handful of favorites and don't really care about the majority of the programming on offer. So what would happen if people could choose exactly which cable channels to subscribe to? This is often referred to as an à la carte pricing system. That certainly sounds like good news for consumers (if you don't want a channel, you don't pay for it), but there's a catch.

You see, not all of your monthly cable bill goes to the cable company itself: a significant portion gets broken up and paid to the individual cable channels. These amounts are called carriage fees, and every cable channel charges them. Here are some example carriage fees from 2012, when the typical cable bill was $61.63:

Channel             Carriage Fee
CNN                 $0.57
Animal Planet       $0.09
Disney Channel      $0.97
ESPN                $5.06
Discovery Channel   $0.37

You'll notice that not all carriage fees are created equal.  More popular channels can demand a higher fee from cable providers in exchange for the right to sell their content to consumers.  ESPN, for example, charges a whopping $5.06 per subscriber (the highest among all channels by a wide margin...the industry average is about 20 cents), while poor little Animal Planet can only wrangle less than a dime.  But maybe you don't really like ESPN, and you'd drop it if you could.  If you were allowed to do just that, you could bring your total bill down to $61.63 − $5.06 = $56.57.

That might sound great, but what's good for you isn't necessarily good for ESPN, or for the subscribers who do want to pay for it. For example, imagine a town of 100 cable subscribers. Under the current system, ESPN makes $506 each month. But under an à la carte system, if only half the town decided to subscribe to ESPN, it would only make half as much money: $253 instead of $506.  If the channel needed to keep its revenues steady, it would have to charge the remaining fifty subscribers twice as much: $506 ÷ 50 = $10.12.
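That arithmetic generalizes to any channel and any drop-off rate: to keep revenue flat, the per-subscriber fee has to scale by the reciprocal of the fraction of subscribers who stay. A quick sketch:

```python
def fee_to_keep_revenue(current_fee, keep_fraction):
    """Per-subscriber fee needed to hold revenue steady when only
    keep_fraction of the original subscribers stay under a la carte."""
    return current_fee / keep_fraction

espn = 5.06  # ESPN's 2012 carriage fee, from the table above
print(f"${fee_to_keep_revenue(espn, 0.5):.2f}")   # prints $10.12 -- twice as much
print(f"${fee_to_keep_revenue(espn, 0.25):.2f}")  # prints $20.24 if only a quarter stay
```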

Because ESPN is so popular, it could probably weather the storm, but an à la carte system would likely wreak havoc on smaller channels.  Organizations like Animal Planet can survive on paltry carriage fees, in part, because they pick up revenue from subscribers who otherwise wouldn't pay to watch.  Essentially, everyone who wants to buy ESPN is charged a mandatory additional $0.09 per month for Animal Planet, which earns the little guy a lot more subscribers than it would get on its own.  If bundling went away, Animal Planet probably would too.  And that's sad...who doesn't like puppies and stuff?

So the most popular channels would have to charge more money to keep their revenues steady, and a lot of the smaller channels would go away entirely.  In the end, doing away with cable bundling could very likely lead to a situation where the average consumer is paying nearly the same amount for a monthly subscription that includes fewer channels (and the industry loses a lot of ad revenue to boot).  Then again, for the consumers who only want a handful of channels, it might be a great deal.  Some Canadian companies offer an à la carte system.  Maybe we should take a cue from our neighbors to the north?

Teachers, do you want to have this conversation in class?  Check out the lesson materials on our site!