Another AGU Fall Meeting is in the books (for me at least…it doesn’t actually officially end for another few hours). Thanks to everyone who came to my talk yesterday and for the useful feedback on potential uses of the new method I introduced. I’m looking forward to seeing everyone again next year in DC.

One of the great things about the AGU meeting is the diversity of science presented. I've really become a fan over the past few years of the Nonlinear Geophysics section. The award for the presentation that taught me the most this year goes to NG34A-07 (for those of you who don't speak AGU: Nonlinear Geophysics' Wednesday evening's 7th presentation) on "Efficient simulation of tropical cyclone pathways with stochastic perturbations". How do you compute something like the 99th percentile of a distribution of simulations without running at least 100 simulations? The easy answer is that you can't. The harder answer is that you can, with some clever resampling. If we're interested in the most intense storms we can simulate given some set of conditions, run 8 (or however many) simulations 25% of the way through. Select the most intense of those and discard the rest. Clone the survivors, give them some stochastic noise, and continue. Then repeat. In the end, you're left with only the most intense storms. That alone would be only mildly informative, except that the presentation also included a way to determine the actual likelihood of those simulated storms given the initial weak storms. Abracadabra: you can estimate the 99th percentile (up to the stochastic noise). Very cool!
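The run-partway / keep-the-best / clone-with-noise loop described above resembles the "splitting" family of rare-event methods. Here's a toy sketch of that idea, not the presenters' actual code: the `step` intensity model, the 8-member ensemble, and the keep-25% rule are all stand-in assumptions, and the `weight` bookkeeping is what lets you translate the surviving extreme runs back into a probability.

```python
import random

def step(intensity, rng):
    # Stand-in storm model: a random walk with slight upward drift.
    # A real application would advance a full cyclone simulation here.
    return max(0.0, intensity + rng.gauss(0.05, 1.0))

def splitting_run(n_sims=8, n_stages=4, steps_per_stage=25,
                  keep_frac=0.25, seed=0):
    rng = random.Random(seed)
    particles = [0.0] * n_sims
    weight = 1.0  # probability mass each surviving clone represents
    for _ in range(n_stages):
        # Advance every simulation partway.
        for _ in range(steps_per_stage):
            particles = [step(x, rng) for x in particles]
        # Keep only the most intense fraction; discard the rest.
        particles.sort(reverse=True)
        n_keep = max(1, int(keep_frac * len(particles)))
        survivors = particles[:n_keep]
        # The weight records how much probability the discarded runs carried:
        # surviving this stage happened with frequency n_keep / n_sims.
        weight *= n_keep / len(particles)
        # Clone survivors back up to ensemble size with fresh stochastic noise.
        particles = [rng.choice(survivors) + rng.gauss(0.0, 0.1)
                     for _ in range(n_sims)]
    return particles, weight

ensemble, weight = splitting_run()
```

The payoff is the weight: with the defaults above, each stage keeps 2 of 8 runs, so the final ensemble represents roughly a `0.25**4 ≈ 0.4%` slice of the original distribution, i.e. only the extreme tail, even though you never ran hundreds of simulations. The probability that a storm exceeds some threshold can then be estimated as `weight` times the fraction of the final ensemble exceeding it.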