Deaths due to terrorism in the UK

I came across the graph above and was immediately struck by the stories it tells, forcing you, the reader, to ask the obvious questions.

Clearly something happened in the 1990s.

The peace process began in Northern Ireland, culminating in the Good Friday Agreement of 1998. Surely this graph alone is enough to convince anyone of the importance and historical significance of the Good Friday Agreement? Why would anyone do anything, anything, to jeopardise its continued success? If anyone should need any convincing that we shouldn't, we mustn't, return to a hard border on the island of Ireland, then surely this graph must be all it takes.

86% of the deaths between 1970 and 1990 were in Northern Ireland.

1988 – includes 270 deaths due to the Lockerbie bombing, when Pan Am Flight 103 from Frankfurt to Detroit, via London and New York, was destroyed in the air over the Scottish town of Lockerbie by a terrorist bomb.

2005? The tragedy of the London bombings, or 7/7.

A simple, sobering graph, but one that deserves – demands – to be viewed.


The Aeroplane Seating Problem

As ever, the publication last week of A level results, and the imminent release of GCSE results later this week, signal the beginning of the end of the wonderful long summer holidays.

I hope you’ve had a fantastic time and, possibly, enjoyed the delights of foreign travel. If you did, I doubt you flew with an airline with such a relaxed seating plan as the one in the problem below …

On this particular flight, MathAir 314, there are 100 passenger seats and 100 passengers.

The first passenger to board has lost their boarding pass and doesn't know their seat number. "No worries," declares the helpful steward, "sit wherever you like."

Each subsequent passenger does have their boarding pass: if their seat is free, they sit in it; if it is occupied, the helpful steward allows them to choose any unoccupied seat they wish. This continues until all 100 seats are filled by the 100 passengers and the jet departs for its destination, Angle C (Anglesey – geddit?)

The question is:

What is the probability that passenger 100 gets to sit in their own allocated seat?

I'm indebted to Zoe Griffiths, @ZoeLGriffiths, for this problem, and you can see her video introducing the problem and, more importantly, her solution below. But before you watch it, have a go at solving the problem yourself first. (You can start the video and then pause it after 1 min 30 sec to see her intro to the problem.)

How might I use this problem? I might introduce it to a class towards the end of a lesson, and ask them to go away and think about it, reporting back with possible solutions, or even just approaches to a solution, next lesson.

Or I might offer the problem at the end of a weekly department meeting, inviting colleagues to think about it before next week's meeting. PE teachers regularly play their sports for fun, and music teachers their instruments; it's important for maths teachers to remain engaged with the subject and "do" some maths from time to time.
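If you (or your colleagues) would like to check your intuition empirically before watching the solution, here is a minimal Monte Carlo sketch in Python. The seat-numbering convention and function name are my own, purely for illustration:

```python
import random

def last_passenger_in_own_seat(n=100):
    """Simulate one boarding of MathAir 314, with seats numbered so that
    passenger k is allocated seat k. Returns True if passenger n ends up
    in their own seat."""
    free = set(range(1, n + 1))
    free.remove(random.choice(tuple(free)))          # passenger 1 sits anywhere
    for k in range(2, n):                            # passengers 2 to n-1
        if k in free:
            free.remove(k)                           # own seat free: sit in it
        else:
            free.remove(random.choice(tuple(free)))  # otherwise pick any free seat
    return n in free                                 # is the last seat passenger n's?

trials = 100_000
hits = sum(last_passenger_in_own_seat() for _ in range(trials))
print(hits / trials)  # hovers suspiciously close to a very tidy number ...
```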

So, here’s the video with the solution, but give the problem some thought before you hit play.

[Video: Zoe Griffiths introduces the aeroplane seating problem and presents her solution]


Were the refs more lenient on the hosts in Russia 2018?

We may be on the eve of the new domestic football season, but before we consign the World Cup, Russia 2018, to history, I just want to share this plot with you.

It compares the fouls committed per game by each team with the number of yellow cards received.

By eye, there does appear to be some correlation between the number of fouls committed by a team and the number of yellow cards they received. This would be expected.
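If you wanted to put a number on that "by eye" impression, a correlation coefficient does the job. Here is a quick sketch in Python; the figures are made-up placeholders, not the Russia 2018 data:

```python
import numpy as np

# Illustrative per-game figures for five imaginary teams
fouls = np.array([9.5, 11.2, 13.0, 14.8, 16.1])  # fouls committed per game
cards = np.array([1.2, 1.6, 1.9, 2.3, 2.6])      # yellow cards received

# Pearson correlation coefficient: +1 is perfect positive correlation, 0 is none
r = np.corrcoef(fouls, cards)[0, 1]
print(f"r = {r:.2f}")
```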

The hosts, Russia, do appear to be an outlier – the third most fouling nation, yet their card count was low.

This could be for a number of reasons – e.g. if many of the offences they committed were for, say, offside, you wouldn't expect them to receive yellow cards for those. But everyone loves a good conspiracy theory – perhaps the refs, either intentionally or subconsciously, were a little more reluctant to flash their cards at the home team …

It prompts an interesting question to explore for the forthcoming season: do away teams get carded more frequently than home teams?

Many thanks to Answer Miner for creating and sharing the plot. You can see his original tweet with it here.


Saints or Sinners?

The football may be over, but the fun never stops!

There is plenty of data on the recent Russia 2018 World Cup to be found on the official FIFA site.

Using their statistics, I have compared the number of fouls committed versus the number of fouls suffered and plotted the scatter graph above. Fouls committed are on the x-axis, fouls suffered on the y-axis. The line is a (computer-generated) line of best fit using linear regression.

The greater the distance above the line, the more “saintly” we can say a team was – more fouled against than fouling; those below the line were the “sinners” of the tournament.

Using my criteria, we can say that, despite not coming home with the trophy, England were the Saints of the World Cup!

(A note of caution, however. As ever with data, we must always consider its validity. Despite this data coming from the official FIFA website, it has a total of 1734 fouls committed, but only 1642 fouls suffered. Since every foul committed by one team should be a foul suffered by its opponent, the two totals ought to match – at the time of writing I can't reconcile the difference.)

UPDATE

A few readers have (correctly) pointed out that the plot is skewed, as not all teams play the same number of games: one would expect France, Croatia, Belgium and England to all be towards the right of the graph, as they played more games than other teams.

So I went back to the data and produced a plot of fouls committed per game v fouls suffered per game. You can see the plot below. I think we can safely say those above the line were the saints, those below the sinners.

[Plot: fouls committed per game v fouls suffered per game, with line of best fit]
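If you would like to reproduce the plot's workings, here is a minimal sketch, assuming the FIFA figures have been saved to a file fouls.csv with columns team, games, committed, suffered – the file name and layout are my assumptions, not FIFA's format:

```python
import csv
import numpy as np

teams, x, y = [], [], []
with open("fouls.csv") as f:
    for row in csv.DictReader(f):
        teams.append(row["team"])
        x.append(int(row["committed"]) / int(row["games"]))  # committed per game
        y.append(int(row["suffered"]) / int(row["games"]))   # suffered per game

x, y = np.array(x), np.array(y)
m, c = np.polyfit(x, y, 1)       # least-squares line of best fit
saintliness = y - (m * x + c)    # vertical distance above the line

# Positive residual: a "saint" (more fouled against than fouling); negative: a "sinner"
for team, s in sorted(zip(teams, saintliness), key=lambda t: -t[1]):
    print(f"{team:12s} {s:+.2f}")
```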


Anyone for tennis?

And so the sun sets on another Wimbledon tournament, one that will be remembered in part for the two losing singles finalists.

Serena Williams was the runner-up in the ladies' final, ten months after giving birth, and Kevin Anderson lost in the men's final to Novak Djokovic, two days after playing the second-longest match in Wimbledon history, taking six hours and thirty-six minutes to overcome John Isner in the semi-final.

Understandably, Anderson has called for a change in how closely fought final sets are decided.

To win a tennis set, a player must win 6 games and be two clear games ahead of their opponent. If the score is 6-6, the set is decided by a tie-break, where the winner is the first to 7 points (as long as they lead by two clear points).

Except for the final set of a match. If the fifth set is tied at 6-6, it continues – with no tie break – until one player leads by two games. And this is where the problem lies.

If both players are good and evenly matched (a reasonable expectation in the semi-final of a Grand Slam tournament), then the maths tells us there will be a significant number of games before a player loses a game on their serve; i.e. stalemate sets in, with neither player able to break the opponent's serve and win by the required two-game margin.

The score in Friday’s final set was 26-24, i.e. it took fifty games to decide the final set.

Looking at the stats for the match, Anderson won 213 of the 278 points played on his serve, giving him a probability of 0.7661 of winning any point on his serve. Isner won 206 of his 291 service points, a probability of 0.7079.

To win a game, you need to win (at least*) 4 points on your serve, with your opponent winning no, one or two points. We can use a bit of basic probability to work out the probability of winning "to love", "to 15" and "to 30".

* The problem is made harder as we need to consider "deuce", when the score reaches 40-40. Due to the need to win by two clear points, "deuce" games could go on for ever. The good news is that we can model "deuce" games as a geometric series, which we can sum to infinity, thereby coming up with a probability for winning a "deuce" game.
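In symbols (my notation: p is the probability of winning a point on serve, and q = 1 − p):

$$P(\text{win to love}) = p^4, \qquad P(\text{win to 15}) = 4p^4q, \qquad P(\text{win to 30}) = 10p^4q^2.$$

Deuce is reached with probability $\binom{6}{3}p^3q^3 = 20p^3q^3$, and from deuce the server must win two points in a row, possibly after any number of shared pairs of points – the geometric series mentioned above:

$$P(\text{win from deuce}) = p^2 \sum_{n=0}^{\infty} (2pq)^n = \frac{p^2}{1 - 2pq},$$

so altogether

$$P(\text{win game}) = p^4\left(1 + 4q + 10q^2\right) + 20p^3q^3 \cdot \frac{p^2}{1 - 2pq}.$$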

Using the probabilities above, I was able to calculate the probability that each player would win a game when serving. More importantly, this allowed me to calculate the probability they would lose a game on their serve, i.e. have their serve broken.

To win the final set, play would go on until (at least) one player lost a game on their serve, so we can treat the number of service games up to and including the first break as a geometric distribution, and calculate the "expectation" of games played per game lost.

For Anderson, the expectation is 25: that means you would expect him to lose one game for every twenty-five he plays on serve. The number is much lower for Isner: we would expect him to lose one game in every eleven.
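Here is a minimal sketch of that calculation in Python (the function name is mine; the per-point probabilities are the ones quoted above):

```python
def p_game(p):
    """Probability the server wins a service game, given probability p
    of winning any single point on serve."""
    q = 1 - p
    pre_deuce = p**4 * (1 + 4*q + 10*q**2)  # wins to love, to 15, to 30
    reach_deuce = 20 * p**3 * q**3          # three points each from the first six
    win_deuce = p**2 / (1 - 2*p*q)          # the geometric series, summed to infinity
    return pre_deuce + reach_deuce * win_deuce

for name, p in [("Anderson", 0.7661), ("Isner", 0.7079)]:
    games_per_break = 1 / (1 - p_game(p))   # expectation of a geometric distribution
    print(f"{name}: one service game lost in every {games_per_break:.0f}")
# Anderson: one service game lost in every 25
# Isner: one service game lost in every 11
```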

What this means is that we can expect long final sets, unless the pragmatic decision is made to allow tie-breaks in fifth and final sets. As players continue to improve, and the advantage of serve continues to increase, the sport's administrators will have to grapple with this conundrum, or look forward to future marathons as the sport descends into a war of attrition.


If you are interested in the formula I derived to calculate the expectation, you can see it below. p is the probability of a player winning a point on their own serve.
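In full (my reconstruction from the working above, with q = 1 − p):

$$E = \frac{1}{1 - P(\text{win game})} = \left[\,1 - p^4\left(1 + 4q + 10q^2\right) - \frac{20p^5q^3}{1 - 2pq}\right]^{-1}.$$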

Expectation to lose a game on serve, where p is the probability of winning a point on serve. The formula gives the average number of service games per game lost (the others being won). For example, Kevin Anderson has a probability of 0.7661 of winning a point on his serve; the formula gives an expectation of 25 (rounded to the nearest whole number), meaning we would expect him to lose one game in every twenty-five he plays. (Note: it does not mean we would expect him to win the first twenty-four then lose the twenty-fifth, but that over a series of twenty-five games he would, on average, lose one.)
