I think it’s a tragedy of the English language that the words “million”, “billion”, and “trillion” sound so similar. The actual difference in scale between them is hard to imagine, even for programmers who are used to dealing with large quantities and orders of magnitude. My current favorite illustrative example is:

Let’s say you have an operation that takes 0.5 ms to complete and you need to run it on a large data set. Running it on a million things takes about eight minutes, a nice coffee break. Running it on a billion things takes almost six days, a background process you really don’t want to interrupt or run twice. Running it on a trillion things takes nearly 16 years. Don’t do that.
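If you want to check the arithmetic yourself, here’s a quick back-of-the-envelope sketch in Python. It assumes a constant 0.5 ms per operation and purely serial execution, which is all the example above assumes too:

```python
# Rough check of the figures above: 0.5 ms per operation, run serially.
PER_OP_SECONDS = 0.0005  # 0.5 ms

for label, count in [("million", 10**6), ("billion", 10**9), ("trillion", 10**12)]:
    total = count * PER_OP_SECONDS          # total wall-clock time in seconds
    minutes = total / 60
    days = total / 86_400
    years = total / (365.25 * 86_400)
    print(f"a {label}: {total:,.0f} s  "
          f"(~{minutes:,.1f} min, ~{days:,.1f} days, ~{years:,.1f} years)")
```

Which prints roughly 500 seconds (8.3 minutes) for a million, 5.8 days for a billion, and 15.9 years for a trillion: each step up is a factor of a thousand, even though the words barely change.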