The 10-step program
- Acknowledge your feelings
- Identify your skills and interests
- Decide what to do next
- Find a mentor
- Go public
- Form a study group
- Re-train and re-brand
- Settle in
- Mentor others
Let’s break these down.
I recently had the privilege of attending the 2016 Australian Academy of Science Theo Murphy High Flyers Think Tank in Canberra. I’d only heard about it via a single tweet the day before applications were due, but with the topic of “An interdisciplinary approach to living in a risky world”, my response was: yes please.
We were also asked to choose our preferred topic for breakout-group discussion, and I got my obvious favourite, the technical theme of “Uncertainty, ignorance and partial knowledge”, which turned out to have some focus on decision theory. The session was chaired by Prof. Mark Colyvan, a professor of Philosophy at my alma mater, The University of Sydney, who had recently responded to Luke Barnes’s talk on the fine-tuning of the universe. Some of the recommended reading got me thinking about matters we didn’t get to cover (like how much I don’t like maximin), but I’ll discuss those with Mark, and I’m sure I’ll blog about them later. In the meantime, our breakout group spent a couple of hours throwing around thoughts and ideas, and we have begun crafting a report and recommendations for the Academy regarding decision-making and risk communication in the face of uncertainty.
My fellow delegates were such interesting people from diverse backgrounds – health, maths, stats, philosophy, history, law, geology, ecology, microbiology and more – and absorbing ideas from these amazing people over the two days provided a complete mental recharge. It was like NYSF for grown-ups. Even the conference dinner speech by emergency doctor David Caldicott was so stimulating, leaving me laughing and crying, that I’d dare say it was the “best event speech ever”.
One of the things I most enjoyed at the Think Tank was, as always, finding out people’s thoughts on rationality over tea breaks. As it turns out, most people I spoke to (about this topic; sample size ~5) were adamant that people are, at heart, irrational creatures. Only one person (besides myself) thought otherwise. I’ve been told I have to read Daniel Kahneman’s Thinking, Fast and Slow to hear more arguments against the assumption of rationality. Apparently there are tests for this sort of thing…
Recently I attended the second ever Bayesian Young Statisticians’ Meeting (BAYSM 2014) in Vienna, which was a really stimulating experience, and something quite new for me, being my first non-astronomy conference. I won a prize for my talk too, which was pretty sweet!
During the two-day overview of theory and a variety of applications by the newest people in the field (read about the highlights over at the blogs of Ewan Cameron and Christian Robert), we heard from a few keynote speakers, including Chris Holmes. In his talk, he mentioned the world of rational decision makers envisioned by Leonard J. Savage in his 1954/1972 tome The Foundations of Statistics (adding that to my ‘to read’ list), and went on to describe the application of a loss function and minimax to avoid worst-case scenarios. Minimax isn’t the only approach to decision-making; I think other approaches are more relevant to our behaviour, as I’ll describe later.
“If you lived your life according to minimax, you’d never get out of bed” – C. Holmes
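To make that quote concrete, here’s a minimal sketch (my own toy example with made-up losses, not from Holmes’s talk). Given a table of losses for each action under each state of the world, the minimax rule picks the action whose worst-case loss is smallest, while an expected-loss rule weights each state’s loss by how probable we think that state is:

```python
# Toy decision problem: should I get out of bed?
# Rows: actions; columns: states of the world (hypothetical losses).
losses = {
    "stay_in_bed": {"ordinary_day": 5.0, "disaster_outside": 0.0},
    "get_up":      {"ordinary_day": 0.0, "disaster_outside": 10.0},
}

# Minimax: choose the action that minimises the worst-case loss.
minimax_action = min(losses, key=lambda a: max(losses[a].values()))

# Expected loss: weight each state's loss by our (assumed) probability of that state.
p = {"ordinary_day": 0.999, "disaster_outside": 0.001}
expected_action = min(
    losses, key=lambda a: sum(p[s] * l for s, l in losses[a].items())
)

print(minimax_action)   # 'stay_in_bed' -- worst case of staying is 5, of getting up is 10
print(expected_action)  # 'get_up' -- expected losses: ~0.01 vs ~5.0
```

That’s exactly the spirit of the quote: under minimax the remote chance of disaster dominates everything, so you never get out of bed, whereas weighting by probability sends you off to work.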
Children are very good at science. They start with broad priors (anything is possible) and learn through collecting data (see picture below) which conclusions are best supported by the evidence. They experiment, make mistakes, and test variations on a theme. They learn what is dangerous; they learn what is tasty; they learn how to speak.
Our responses to experiences are very similar to Bayesian reasoning. Take trust as an example. If some dudette off the street – let’s call her Margaret – were to recommend a movie, say Moon, we might not heed her words, since we have no reason to think we share her taste in movies. But if, upon watching Moon, we found that we quite enjoyed it, we’d be more likely to rely on Margaret’s next tip, say Wadjda. And if Wadjda were also to our liking, we’d probably trust Margaret’s advice when she suggests Fast & Furious 6 (oops). That blunder would reduce our confidence in her next recommendation, and so on. If we describe our experience of each movie in binary terms such as “liked” and “disliked”, the situation resembles the classic coin-toss experiment, in which one tries to determine whether a coin is biased by flipping it many times.
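As a sketch of that coin-toss analogy (my own illustration, with made-up numbers): treat “Margaret’s recommendation is good” as a coin with unknown bias θ, start from a flat Beta(1, 1) prior, and do the standard conjugate update after each movie. The posterior mean then tracks how much we should trust her next tip:

```python
# Beta-binomial updating of our trust in Margaret's movie tips.
# theta = probability a recommendation is good; prior: Beta(1, 1), i.e. flat.
alpha, beta = 1.0, 1.0

# Our (hypothetical) experiences, in order: True = liked, False = disliked.
history = [("Moon", True), ("Wadjda", True), ("Fast & Furious 6", False)]

for movie, liked in history:
    # Conjugate update: a 'like' bumps alpha, a 'dislike' bumps beta.
    if liked:
        alpha += 1
    else:
        beta += 1
    trust = alpha / (alpha + beta)  # posterior mean of theta
    print(f"After {movie}: P(next tip is good) ≈ {trust:.2f}")

# After Moon:             0.67
# After Wadjda:           0.75
# After Fast & Furious 6: 0.60
```

Each hit raises the posterior mean and each miss drags it back down, just as the Fast & Furious 6 blunder dents, but doesn’t destroy, our confidence in Margaret.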