- I can't teach myself data science on my crappy old Windows machine.
- I've decided to get a new Windows machine. Here are the specs.
It appears that two things happened while I wasn't paying attention.
- Apple ceased to exist.
- The cloud ceased to exist.
Some people have farcical explanations for why Apple Cannot be Taken Seriously.
In this specific instance, there was a large investment in Python and Java that somehow couldn't be rebuilt on Mac OS X. Details were explicitly not provided. Which is a way of saying there were no tangible "requirements" for this upgrade. Just specification numbers.
Important note. None of this involved "data" or "science." That was the baffling part. No objective measurement of anything. No list of software titles. No projects. No dataset sizes. Nothing.
The anti-cloud argument was even stranger than the anti-Mac argument.
Somehow, a super-large AWS server -- let's say it was an x1.16xlarge -- being used an hour a day (365 days × 1 hr/day × $1.82/hr = $664) was deemed *more* expensive than a 64 GB, 6-core home-based machine that would sit idle 23 hours each day.
The best part of the claim that $664/yr was *more* expensive?
No "data". No "science". No measurement. No supporting details.
I wish I'd kept the email describing how someone who knew something said something about pricing. It was marvelous Highest Paid Person's Opinion nonsense.
AFAIK, they were using 8,766 hours per year -- 24 hours a day, every day -- to compare AWS computing vs. at-home computing. This meant an m5.4xlarge was reckoned to cost $1,939 each year. Presumably because they'd never shut it off.
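Here's a hedged sketch of what that assumption does to the numbers. The hourly rate is derived backwards from the quoted $1,939/yr, since no actual rate or pricing plan was ever provided; the point is only that the annual figure assumes the machine is never turned off.

```python
# Work backwards from the quoted annual figure to the implied hourly rate.
ANNUAL_COST_QUOTED = 1_939   # USD/yr for an always-on m5.4xlarge, as quoted
HOURS_PER_YEAR = 8_766       # 24 hr/day * 365.25 days/yr -- never shut off

implied_rate = ANNUAL_COST_QUOTED / HOURS_PER_YEAR
print(f"Implied rate: ${implied_rate:.2f}/hr")            # about $0.22/hr

# At the actual usage pattern -- an hour a day -- that same rate works out to:
print(f"One hour a day: ${implied_rate * 365:,.0f}/yr")   # about $81/yr
```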
The email also included terms like "half-way decent performance."
There's a depth of wrongness to this that's hard to characterize beyond no "data" and no "science".