Imagine yourself at a casino with $100 but in need of $1000 by morning. The question is not whether to gamble, but how to gamble so as to maximize the probability of reaching your goal. This is an example of a stochastic control problem. Other examples are how to invest retirement funds in the stock market to maximize the chance of accumulating a given fortune by retirement age, and how to select treatments for a sequence of patients to maximize the expected number of successes.
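The casino problem above can be made concrete with a small simulation. As a hedged sketch, assume even-money bets that win with probability 18/38 (a red-or-black bet at American roulette); the strategy names, stake sizes, and trial counts below are illustrative choices, not part of the original problem statement. It compares "bold play" (bet everything needed, or everything you have) against "timid play" (bet a small fixed stake):

```python
import random

def bold_play(fortune, goal, p, rng):
    """Bold play: each spin, stake the whole fortune or the amount
    still needed to reach the goal, whichever is smaller."""
    while 0 < fortune < goal:
        bet = min(fortune, goal - fortune)
        fortune += bet if rng.random() < p else -bet
    return fortune >= goal

def timid_play(fortune, goal, p, rng, bet=10):
    """Timid play: stake a small fixed amount each spin."""
    while 0 < fortune < goal:
        stake = min(bet, fortune, goal - fortune)
        fortune += stake if rng.random() < p else -stake
    return fortune >= goal

def estimate(strategy, trials=20000, seed=0, **kw):
    """Monte Carlo estimate of the probability of turning $100 into $1000."""
    rng = random.Random(seed)
    wins = sum(strategy(100, 1000, 18 / 38, rng, **kw) for _ in range(trials))
    return wins / trials

p_bold = estimate(bold_play)
p_timid = estimate(timid_play)
print(f"bold play:  {p_bold:.4f}")   # roughly 0.08
print(f"timid play: {p_timid:.4f}")  # very close to 0
```

Since each bet has negative expected value, playing longer only erodes the fortune: bold play succeeds roughly 8% of the time here, while grinding out $10 bets almost never reaches the goal.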
A casino game such as roulette is a "one-person" game since the payoff to a player depends only on that player's bets together with the rules of the game and the spin of the wheel. In other games such as poker there are several players and the payoffs depend on the actions of all the players. More complicated "many-person" games are often used to model the activities of consumers and producers in a market economy.
I like to think about concrete control problems and games like these, and I enjoy studying the associated mathematical problems -- some of which are quite delicate and require advanced mathematical machinery. I also like to think about the foundations of probability and statistics. My favorite approach to the foundations is that of Bruno de Finetti, who suggested defining probability in terms of gambling odds.