I frequently do blatantly inaccurate math just to spitball, and when I say the numbers that I’m computing out loud, people get amazed that I can keep track of so many numbers when I’m only tracking the result of the previous calculation and the operation that I’m about to perform.
I’m like, dude, if you accounted for the rounding errors, you would realize how fucking wrong I am, but this math is not precision-important, and so I’m just trying to get an idea of the scope of the numbers that I need to address whatever problem I’m working on.
For instance, if you asked me to spitball how far it is from Los Angeles, California to Atlanta, Georgia, and how long it would take you to drive that, I would assume you would average about 50 miles an hour after breaks and whatnot, and that you would be able to drive approximately 12 hours a day, which means you could clear about 600 miles a day. Off the top of my head I would guess it’s about 3,200 miles between Los Angeles and Atlanta, assuming you stay on the 40 as much as you can once you get to Amarillo, TX, so I would figure the average driver would take five days and approximately four hours to drive that distance.
This is very off the cuff, off the top of my head: I could be 600 miles off on the distance in either direction, I could be off by 10 or 12 miles an hour on the average speed, and I could be off by 4 or 5 hours, or not even have accounted for a co-driver on the trip.
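If you wanted to write that same spitball out, it’s nothing fancier than this. A minimal sketch, using only the numbers I guessed above (50 mph average, 12 driving hours a day, ~3,200 miles):

```python
# Spitball math: how long to drive my guessed LA-to-Atlanta distance.
# All three inputs are guesses, not measured values.
avg_speed_mph = 50      # guessed average speed after breaks
hours_per_day = 12      # guessed driving hours per day
guessed_miles = 3200    # off-the-top-of-my-head distance guess

miles_per_day = avg_speed_mph * hours_per_day           # 600 miles a day
full_days, leftover_miles = divmod(guessed_miles, miles_per_day)
leftover_hours = leftover_miles / avg_speed_mph          # 200 miles -> 4 hours

print(f"~{full_days} days and {leftover_hours:.0f} hours")  # ~5 days and 4 hours
```

That’s the whole trick: one running number and the next operation.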
You can do the trip in like two-ish days. I have done the trip in like two-ish days.
But reality and guesstimation are two separate things, and there’s no reason to be amazed by somebody’s guesstimation capabilities. It’s very basic math that doesn’t require any skill greater than your multiplication tables.
I don’t know why more people aren’t good at it.