Does it make sense to bike without a helmet?

The blog post “Why it makes sense to bike without a helmet” is giving me a headache. It is wrong, wrong, wrong, yet it’s surprisingly difficult to point out exactly why. The author argues that  “if we start looking into the research, there’s a strong argument to be made that wearing a bike helmet may actually increase your risk of injury, and increase the risk of injury of all the cyclists around you.”

The author essentially argues that by sacrificing some personal safety now, he can improve the safety of everyone in the future. That is a laudable attitude. But is he actually doing that? I am becoming more and more interested in cycling safety as I turn greener and greener, so this needs to be analyzed properly. A faulty argument in favor of a good cause is not acceptable.

The author cites an impressive number of statistics, but the arguments seem to be quite simply invalid: correlation and causality have been confused, among multiple other errors. It would be easy to shrug it off, but the post has been shared and discussed widely.

Also… it would be way too lazy to just sit on the sidelines and criticize. The author bravely went out on a limb and said something controversial, even though it seems he’s completely wrong. So, here’s a counterquestion that respects that bravery: are there any conditions under which he would in fact be correct?

The logical chain

Here’s my reconstruction of the main logic of the blog. These are not the exact claims of the author, but something that can be inferred from the text. The mathematical additions are mine.

1. Helmets decrease the risk of serious injury if a cyclist has an accident. This is a conditional probability: p(S|A), the probability of serious injury S given an accident A. p(S|A) is smaller if one wears a helmet. The overall probability of severe injury is then p(S) = p(S|A)*p(A).

2. Currently, the probability p(A) of being in an accident is relatively high when cycling. For someone who cycles a lot, it is probably in the range of 1% per year (my estimate).

3. If cities were optimized for biking, the probability of an accident p(A) would be much lower than it is now. Biking might not be any more dangerous than driving a car or walking. At that point, it would be irrelevant whether or not one wore a helmet.

4. To force cities to be optimized for biking, one must motivate the maximum number of people (N) to cycle for maximal amounts of time (T); that is, maximize the amount of cycling, C = N*T. The larger C is, the smaller p(A) will be. For future reference, note that C can be considered a general measure of how attractive cycling is perceived to be.

We don’t really know how to model the effect. However, for lack of a better model, we could assume that it follows the exponential form p(A) ~ f(λ,C) = λ*exp(-λ*C), which has mean 1/λ. Since we can scale the constants freely, let us set λ=1. Then the current probability of an accident is P0 = exp(-C0). We want to evaluate how the probability changes as C changes.

5. Mandatory helmet use is likely to decrease both the number of cyclists, and the time used for casual cycling. We can call this the F-factor, as in “F you”, where F<1. Then the accident probability given mandatory helmets is p(F)=exp(-C0*F) = P0^F.

Rough estimate: if the current personal probability of an accident per year is 1%, and a mandatory helmet decreases cycling by 10% so that F=0.9, then the mandatory helmet would raise the personal probability to (0.01)^(0.9) or 1.6%.
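As a sanity check, the arithmetic of this estimate is easy to reproduce. A minimal Python sketch, using only the model and the numbers from the text:

```python
# Numbers from the text: current yearly accident probability, and a
# helmet mandate that decreases the amount of cycling by 10% (F = 0.9).
P0 = 0.01
F = 0.9

# Under the model p(A) = exp(-C), scaling the amount of cycling C0 by F
# turns P0 = exp(-C0) into exp(-C0 * F) = P0**F.
p_mandatory = P0 ** F
print(f"{p_mandatory:.4f}")  # 0.0158, i.e. about 1.6% per year
```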

6. Therefore, mandatory helmet use will slow down the target of creating a biking-optimized city, and increase the probability of being in an accident. Up to here, the arguments may actually be valid. However, now it starts to break down.

What is missing 1: Going from big F to little f

There is a problem here. Whether an individual wears or does not wear a helmet does not have any bearing on whether the government does or does not make helmets mandatory.

The author seems to imply that using a helmet is “giving in”: it is a signal to society that cyclists can be trampled on. This sounds vague, but let’s model it in any case. We could consider such an effect to be similar to the F-factor, in that it makes cycling less attractive to everyone. We can even model it similarly, calling it small f.

Using a helmet would thus increase the probability of being involved in an accident to P0^f. Note that by our definitions, f is larger than F; a small effect means that the value of f is close to 1.

What is missing 2: going from probability to risk

Why does this sound completely unsatisfactory? Because we are missing something crucial. We really need to look at risk rather than probability alone. Risk is the product of the probability times the impact (almost literally, in this case). We can call this damage parameter D. (The units could for example be the cost of emergency brain surgery).

The amount of damage we can expect in an accident depends on helmet use. With a helmet it is D0, without a helmet it is D1.  Set D0 to 1 for simplicity. We know that D1>>1. For very serious head injuries, which really are the crucial ones, D1 might be 10 or more.

We can then calculate a damage matrix. The calculation is identical for small f.

                 Voluntary        Mandatory
No helmet        a = P0*D1        b = P0^F * D1
Helmet           c = P0*D0        d = P0^F * D0

The values a-d are the damage we can expect within the given time period for each scenario. To get some grasp of the values, we can set P0=1%, F=0.9, and D1=2 (a very low value).

                 Voluntary        Mandatory
No helmet        a = 2.0%         b = 3.2%
Helmet           c = 1.0%         d = 1.6%

Clearly, wearing a helmet causes less damage in all scenarios. However, here is the most interesting question: are there any conditions in which a<d, that is, cycling voluntarily without a helmet is safer than cycling with a mandatory helmet? We need D1*P0 < P0^F, or F < 1 + log(D1)/log(P0). For the sample values above (P0=1%, D1=2) we require that F<85%. If we assume a more realistic D1=10, we require F<50%.
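The break-even condition can also be checked numerically. A minimal sketch, where a-d are the scenarios of the damage matrix (the function names are mine):

```python
import math

def damage_matrix(P0, F, D1, D0=1.0):
    """Expected damage per period in each scenario."""
    a = P0 * D1       # voluntary helmets, rider without a helmet
    b = P0**F * D1    # mandatory helmets, rider without a helmet
    c = P0 * D0       # voluntary helmets, rider with a helmet
    d = P0**F * D0    # mandatory helmets, rider with a helmet
    return a, b, c, d

def break_even_F(P0, D1):
    """a < d requires F < 1 + log(D1)/log(P0)."""
    return 1 + math.log(D1) / math.log(P0)

a, b, c, d = damage_matrix(P0=0.01, F=0.9, D1=2)
print(round(break_even_F(0.01, 2), 2))   # 0.85: helmetless wins only if F < 85%
print(round(break_even_F(0.01, 10), 2))  # 0.5: with D1=10, F must drop below 50%
```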

Thus, it is possible to envision scenarios in which cycling without a helmet is safer. But are these credible scenarios? We would have to assume that mandatory helmets would decrease cycling by tens of percent (even 50%). Possible, but unlikely.

Even more problematic for the author’s case, we would have to assume that the peer pressure of voluntarily worn helmets would have an effect similar to that of mandatory helmets. Perhaps, but it can hardly be as large as the effect of an actual mandate.

There are in fact other arguments against mandatory helmet use. For example, there is a very real phenomenon called risk compensation (also known as the rebound effect): if safety is improved by a passive solution such as a helmet, people tend to engage in riskier behavior because they feel safer doing so. The end result is that safety is not enhanced; it may even be decreased if the perceived improvement is much larger than the actual improvement.

However, this is not really considered in the blog. The core question is: by choosing to cycle without a helmet, is the author significantly increasing the future safety of others, and also by extension himself? Crunching the numbers: no.

Basically, the author is suggesting a massive and highly likely personal sacrifice, for a fairly small and fairly hypothetical improvement. Such a tradeoff is heroic, but it really does not make much sense.


Nuclear propulsion

I’ve been designing a Mars mission with a nuclear rocket. Admittedly this might be a bit much for a one-man operation. It grew out of a desire to render a NERVA II rocket engine with Blender. Although I’m not known to be detail oriented, the things that I try to model should look at least a little bit like they might look if they were actually made some day, so I used “existing” hardware to estimate the weight of a spaceship and then plugged the numbers into the rocket equation. After some tuning I came up with a two-stage space tug that has about 1.5 Gg of mass at low Earth orbit. This contraption should be able to transfer five BA 330 modules and 200 Mg of cargo to Mars orbit.
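The back-of-the-envelope step here is just the Tsiolkovsky rocket equation. A minimal sketch: the specific impulse of ~850 s is a commonly quoted figure for solid-core nuclear thermal engines, and the burnout mass is an illustrative assumption, not an actual design value:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def delta_v(isp_s, m0_kg, mf_kg):
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp_s * G0 * math.log(m0_kg / mf_kg)

# 1.5 Gg = 1.5e6 kg at low Earth orbit (from the text); the burnout
# mass of 8e5 kg is assumed purely for illustration.
dv = delta_v(isp_s=850, m0_kg=1.5e6, mf_kg=8e5)
print(f"{dv:.0f} m/s")  # roughly 5200 m/s for these assumed numbers
```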

The propulsion unit for my design. It needs six.

Before anyone gives harsh critique of the numbers: this is a very notional design; I’d be happy if the numbers are within an order of magnitude of the correct ones. And did I mention that much of the hardware is sort of vaporware, or less real?

Close up of the NERVA II. To show the scale, the big cube is 10^3 m3, the green speck is 0.1^3 m3. The hydrogen tank is almost 47 m long.

I like physics, I like rockets and almost anything space related, so for me this type of thinking by doing is fun. What is more surprising is that some quite serious people have thought that this could actually be done. Nuclear rocket engines have been proposed and tested (“direct” nuclear jet engines too).

“Steady progress was made in engine efficiency and controllability, and in lowering the release of radioactivity” [from here]. Just to make it clear: these beasts were no sissy nuclear-electrics; the idea was to spray a hot reactor core with hydrogen. Several designs were tested in the atmosphere. The word that surely comes to mind when thinking about this sort of engine test is erosion. One would expect that active parts of the core would be spewed out of the hot end even in normal operation.

There is always the possibility of the not-so-unlikely turbopump failure. While my limited knowledge suggests that, because there is no need for an oxidizer, it is a bit easier to design one, an eventual pump failure could still lead to a loss of coolant. Not to worry: they tested (KIWI-TNT) what happens if you stop the coolant. Boom.

While my design sketch is a space tug, i.e. it would never be used in the atmosphere thus limiting the release of radioactive substances to the biosphere, these engines were also suggested as upper stages for chemical rockets to boost performance. Then there is of course Project Orion, which from the current viewpoint boggles the mind.

No point, just some perspective.

Is aviation safety a shameful thing? Final summary

Safety is an important part of aviation. Although many customers do not care, we feel that it should be transparent to those customers who do. Studying airline web pages showed large variations between airlines, but to summarize: it appears to us that the majority of airlines want their ordinary passengers to think of flying as a non-technical activity that entails no risk, and hence no need for safety measures.

However, some airlines do go into significant depth about their safety procedures. Many of those airlines are in developing countries with poor safety records, and appear to use safety as a marketing tool to reassure customers. However, some well-known Western airlines also have a similar approach. In essence, we found no external parameters that would explain the differences.

We interpret this to mean that safety can be used as a marketing tool. Some airlines choose to use it; some do not. Nothing external forces an airline to be transparent or opaque about its safety culture; rather, this is a (business) decision that is made by the company.

Three aspects were studied (see also full project page).
Report 1. Do airlines make safety information available to their users on their main web pages? (By Jakke Mäkelä)
Report 2. Do airline web pages have any mention of accidents or incidents that have happened? (By Niko Porjo)
Report 3. Is there any external factor that would systematically explain any differences? (By Niko Porjo)

Report 1: The web sites of 83 major airlines were analyzed. Only 35% seem willing to even mention safety on their official web pages (what we decided to call a “safety-positive” approach towards customers). Airlines in developing countries were more safety-positive; up to 65% of them used safety as a marketing tool. However, this was not a hard-and-fast rule; some developed-nation airlines like British Airways and All Nippon Airways had a very large focus on safety issues.

Report 2: The web pages of 46 airlines were scanned in detail to see whether any information at all could be found about accidents that had occurred to the airlines. Out of 37 airlines with a fatal accident, 10 mentioned the accident somewhere on the web page. However, the information was technically quite shallow.

Report 3: A simple metric was used, where the number of hits in a search for the keywords “safety” and “accident” served as a proxy for the amount of accident information that the airline wishes to make available. This number was correlated with a number of internal parameters that could affect it (such as airline size), as well as external parameters such as the GDP and Global Integrity Report score of the carrier country. No statistically significant correlations were found.

The overall impression is that for any ordinary passenger ordering a ticket and browsing around the web site, the majority of airlines do not wish to bring up the issue of safety in any way. The factors that are emphasized are price and quality. When anything more is described, it is positive things like social responsibility, equal opportunity, sponsorships, and  so on. However, there is a dichotomy: those airlines that do mention safety tend to do so extensively.

Those are fairly objective facts; what personal opinions should we draw from them? We are rather surprised that so few airlines choose to be open about safety. Silence on this issue does not benefit the customers. Customers should be able to make informed choices, and this includes understanding the safety record of the airline.

Perhaps it does not benefit the airlines either.  In a culture of silence, safety only becomes visible when disaster strikes. The easiest way for an outsider to understand the airline’s safety culture is to read accident investigation reports on how it failed. This is hardly positive advertising. Would it be possible for airlines to utilize their safety culture in a more proactive way?

Accident information comparisons

 

Safety- and accident-related information is available on the websites of airlines, but you will find more of it if you use Google. There were also a couple of interesting peculiarities in the data, where it seems possible that the airline was hiding information. See also Part 1; this post is part of our “Is aviation safety a shameful thing?” project.

In this second part I will compare the number of safety/accident-related links found by the airline’s own search to the number found by Google when limited to the website in question. Both link counts are also analyzed against information about the airlines and their home countries. The intention is to find out how open airlines are with this information. Absolute numbers show how much information is available; relative numbers show how well it can be found with the search provided by the airline, and might give a hint of how desirable it is for the airline to show that information. Comparison with other data might reveal factors that are common to airlines with a high or low number of links.

I searched through 46 airlines. Figure 1 shows the raw link counts. The x-axis shows how many links were found and the y-axis shows the number of airlines that had that count. The large blue and orange bars at x=0 show that for many airlines the homepage was a poor choice for finding safety or accident info.

On the other hand, Google is able to find information (yellow and green bars) on both subjects, and in some cases quite a lot of it. It should be noted that I only counted to 10; if there were more links, I ignored them. This document of the raw data shows in more detail which links were accepted into this data set.

Links to anything that the passengers would find out during the trip, such as pre-flight safety announcements, were rejected. Another category that was not accepted was links to insurance terms and conditions. The reason is that I am interested in what “extra” information is available at the website.

Figure 1. Number of links found for both searches and words.

I’d like to be a little cautious when drawing conclusions from this data, mainly due to the low number of airlines, but also due to the data gathering process: it was done by me alone, without much help. In my experience this leads to a less rigorous result than a group effort. But one thing seems to be pretty certain: Google is better at finding this information than the search functions on the airline web sites.

This is true even if the 12 airlines that didn’t have a search are removed from the zero column. For the whole set, when the number of links found for one airline by one search is summed, Google finds more links in 39 cases, while in only two cases the homepage search returns more results (Qantas 5 vs. 4 and Czech Airlines 7 vs. 5).

At least one of the airlines uses Google to power its search (US Airways). This offers an interesting comparison: the US Airways homepage search found 3 safety-related and 1 accident-related link, while the general Google search found 1 safety-related and 7 accident-related links.

While I was not logged in to my Google account, it is possible that Google had picked up on the fact that the same computer had been intensely searching for accident info for several days and used this knowledge to show what was most interesting to me.

A more sinister explanation is that the results from the search provided at the homepage have been filtered to exclude what I was looking for. Searching the US Airways site with the site’s search for “1549” gives (18 March 2012) one result about a general chronology of the airline, and tells that some results have been omitted. If one includes those, four more links to the same chronology appear. It is still possible that this is a result of some more general decision not to include parts of the web site in the site search, but I’d say there is a good possibility that this is intentional.

In the case of Kenya Airways, a Google search gave two links to the accident of KQ 507, but when I followed those links they gave a 404 (i.e. page not found). This could be due to several reasons and need not be intentional. The accident was mentioned in an annual report.

Table 1. Mean and median number of links found by Google for different sub-populations.

 

Table 1 shows the mean and median number of links found by Google for different sub-populations. “Whole set” includes all the airlines, while “Google and Homepage” includes only those cases where both searches were available, and “Google only” includes only the cases where there was no homepage search.

In all cases there are more links related to safety than to accidents, but the difference is not massive. Results for the word “safety” show no definite differences between the populations. For “accident”, airlines with their own search show more info. This difference could be explained if the airlines with no homepage search had had fewer accidents, but for only 3 of those 12 cases could I find no fatal accident in the history of the airline. Four of the 12 airlines without a homepage search function are low-cost airlines, which might have less expansive websites and therefore less information. This result is similar to what Jakke saw in his analysis of airline homepages.

I compared the link counts against a data set (or here) with info on

  • number of employees
  • number of yearly passengers
  • revenue
  • year the airline was founded
  • GDP (PPP) per capita of the airlines home country
  • global integrity report overall score of home country
  • corruption perception index
  • IATA membership
  • date of latest accident

It was difficult to find all the data for all the airlines, so there are some gaps. The data is also unreferenced and from various sources. Some plots with a very short description are available here. There is a modest correlation between the date of the latest non-fatal accident and the total number of links found, which just might be significant. There is also a modest correlation between the Global Integrity Report overall score and the total number of links. But the plots show that, in addition to the set being quite small, there might be other data-related difficulties that make this type of analysis less trustworthy.
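The kind of correlation check described above is easy to sketch in code. A minimal, hand-rolled Pearson correlation; the data below is invented for illustration and is not our actual data set:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: Global Integrity overall score vs. total link count.
integrity_score = [55, 60, 70, 72, 80, 85, 90]
link_count = [1, 3, 2, 5, 4, 7, 6]
print(round(pearson_r(integrity_score, link_count), 2))  # 0.86 for this toy data
```

With only a handful of airlines, even a coefficient this size can easily be noise, which is why the post is cautious about significance.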

Overall, the small numbers in Table 1 suggest that openness is not the approach chosen for these subjects. Further, there is accident-related information at many airline websites, but you might not find all of it with the search provided by the airline.

In the third part of this series I will attempt to rate the links and see if any info comes out of that.

Aviation safety: We made a mistake and learned, or did we?

 

 

Before Jakke’s post on aviation safety we had a discussion on the likelihood of airlines documenting accidents on their web sites. I think it would be a good thing to actually show when mistakes are made. It would be especially good to show what has been learned and how the organization has responded to improve safety. That being said, we all thought that no info would be available.

To complement Jakke’s findings I searched for the most recent fatal and non-fatal accident of a semi-random collection of airlines. I then made an attempt to find information on (or at least some reference to) those accidents on the web pages of the respective airlines. I mainly used Wikipedia as a source of accident dates, since it is easy to use, fairly trustworthy, and also lists non-fatal accidents for many airlines. If nothing was found for an airline, I tried googling a bit to check whether it was likely that there had actually been no accidents. Altogether I went through info on 46 airlines.

Contrary to what we thought, information is available. Sometimes even cases that took place before ARPANET was functional can be found. In the figure below, each of the accidents I found is shown with a dot at the year it happened. The count goes up each time I was able to find the accident on the web site of the airline. As can be seen, most of the references were found when the accident date was after the year 2000 (steep slope at the end). This is natural if the web site is not considered to be a repository.

In 37 cases I found a fatal accident related to an airline, and in 10 of those cases there was a reference to that accident on the web site. This may not sound like much, but it was much more than I believed it would be. But, and there is a but, this info was not meant for customers. It isn’t very in-depth info either. It is mostly a short paragraph in a financial statement or a press release. In the figure below, the pillars from left to right are: number of airlines that were checked for a known accident, number of airlines that had a reference to the accident, number of cases where the reference was in a financial statement, number of cases where the reference was in a press release, and number of cases where the reference was somewhere else on the web site.

While I’m kind of happy that there is honesty about the fact that there are accidents and incidents, I’m disappointed at the level of technical info released. For example, in only one case did I find a link to the accident investigation report, here.

While I was looking for this info I also made some other notes related to this subject, and will write more about them in another post.

Some more info and less opinion is available here, and the spreadsheet I used can be seen here. This post is part of our “Is aviation safety a shameful thing?” project.

 
