By Tom Cranstoun
In 1977, I made what felt like a monumental investment: $795 for a Commodore PET computer. But that was just the beginning. The need for storage led to another equally significant purchase – a disk drive system for another $795. That's a total of $1,590 in 1977, equivalent to approximately $7,950 in today's money. For context, that total investment would have bought you a decent used car back then. The PET was revolutionary for its time, featuring a built-in monitor, keyboard, and cassette deck in one integrated unit. It came with a whopping 4KB of RAM (yes, kilobytes), ran at a blazing 1 MHz, and with the disk drive, offered unprecedented storage capabilities for a personal computer.
Fast forward to 2025, and I find myself contemplating another significant investment in computing technology, this time for running the full DeepSeek-R1 AI model locally. The parallels in terms of technological ambition – and cost – are striking.
The Modern Challenge: Building a DeepSeek-R1 Capable System
Just as the Commodore PET with its disk storage represented the cutting edge of personal computing in 1977, running a full-scale language model locally in 2025 requires pushing the boundaries of current consumer hardware. Here's what it takes:
Core Components
Processor (CPU)
Recommendation: AMD EPYC 9005/9004/7003 Series
Cost: $2,000 - $4,000
The computational heart of the system, offering power that would have seemed like science fiction in 1977
Memory (RAM)
Recommendation: 768GB DDR5 RDIMM
Cost: $3,000 - $4,000
Compare this to the PET's 4KB – we're talking about nearly 200 million times more memory (a rough sketch of why the model needs this much RAM follows this component list)
Motherboard
High-end EPYC-compatible board
Cost: $500 - $1,000
Required to orchestrate all these powerful components
Storage
2TB NVMe SSD
Cost: $200 - $300
Faster than the PET's disk drive by a factor of millions, and about 20 million times the capacity
Power and Cooling
1000W PSU and advanced cooling
Cost: $300 - $400
The PET drew about as much power as a desk lamp; this system needs serious power management
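For the curious, here's a minimal back-of-the-envelope sketch of why the memory requirement is so large. It assumes the commonly reported figure of roughly 671 billion parameters for the full DeepSeek-R1 model; the 10% overhead factor and the quantization choices are my own illustrative assumptions, not part of the parts list above.

```python
# Rough memory-footprint sketch for the full DeepSeek-R1 model.
# Assumptions (mine, for illustration): ~671 billion total parameters,
# plus ~10% overhead for KV cache, activations, and runtime buffers.

def model_memory_gb(params_billion: float, bytes_per_param: float,
                    overhead: float = 1.10) -> float:
    """Approximate RAM needed for the weights, in (decimal) gigabytes."""
    return params_billion * bytes_per_param * overhead

PARAMS_BILLION = 671  # commonly reported size of the full DeepSeek-R1 model

for label, bytes_per_param in [("8-bit / FP8 weights", 1.0),
                               ("4-bit quantization", 0.5)]:
    gb = model_memory_gb(PARAMS_BILLION, bytes_per_param)
    print(f"{label}: ~{gb:,.0f} GB")

# 8-bit / FP8 weights: ~738 GB  -> hence the 768GB recommendation
# 4-bit quantization:  ~369 GB  -> fits with plenty of headroom
```

At 8 bits per parameter the weights alone land in the region of 740GB, which is exactly why 768GB of RAM is the sensible floor for this build.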
The PET Equivalency
RAM: The DeepSeek system's 768GB of RAM equals 196,608,000 original PETs (each PET had 4KB)
Processing: With the EPYC running at around 4.1 GHz compared to the PET's 1 MHz, that's 4,100 times faster per core. Multiply that by 96 cores, and you'd need roughly 393,600 PETs to match the raw processing power
Storage: A modern 2TB NVMe drive equals about 20 million PET disk drives (each had roughly 100KB of storage)
Cost Equivalent: At $795 each, the 393,600 PETs needed to match the processing power would have cost over $312 million (not inflation adjusted!); matching the RAM capacity would take nearly 200 million PETs, well over $150 billion
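If you want to check the arithmetic yourself, a few lines of Python reproduce the comparisons above. The exact RAM count wobbles around 200 million depending on whether you count gigabytes and kilobytes in decimal or binary; everything else falls straight out of the figures already quoted.

```python
# A quick sanity check of the PET equivalency figures.

PET_RAM_KB = 4          # Commodore PET: 4KB of RAM
PET_CLOCK_MHZ = 1       # 1 MHz 6502
PET_DISK_KB = 100       # roughly 100KB per diskette
PET_PRICE = 795         # 1977 dollars

ram_pets = 768e9 / (PET_RAM_KB * 1e3)          # 768GB of DDR5
clock_ratio = 4.1e9 / (PET_CLOCK_MHZ * 1e6)    # 4.1 GHz per core
cpu_pets = clock_ratio * 96                    # 96 cores
storage_pets = 2e12 / (PET_DISK_KB * 1e3)      # 2TB NVMe

print(f"RAM:        ~{ram_pets:,.0f} PETs")        # ~192,000,000 (≈200 million)
print(f"Processing: ~{cpu_pets:,.0f} PETs")        # ~393,600
print(f"Storage:    ~{storage_pets:,.0f} drives")  # ~20,000,000
print(f"Cost to match processing: ${cpu_pets * PET_PRICE:,.0f}")  # ~$312,912,000
```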
The Price of Progress
The total cost for a DeepSeek-R1 capable system ranges from $6,100 to $10,900. Add another $1,500 - $2,000 if you want to include a high-end GPU for additional acceleration. Interestingly, this range is not too far off from the inflation-adjusted cost of the complete Commodore PET setup ($7,950). When you consider that many enthusiasts might opt for the higher-end components plus a GPU, the total investment ends up being remarkably similar.
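As a quick sanity check on that claim, here's the comparison in code, using the article's own figures; the 5x multiplier is simply the approximation behind the $7,950 estimate, not an official CPI calculation.

```python
# Comparing the 1977 outlay to the 2025 build, in today's dollars.

pet_setup_1977 = 795 + 795              # PET plus disk drive
inflation_multiplier = 5.0              # 1977 -> today, rough approximation
pet_setup_today = pet_setup_1977 * inflation_multiplier

deepseek_build = (6_100, 10_900)        # base build range
with_gpu = (deepseek_build[0] + 1_500, deepseek_build[1] + 2_000)

print(f"PET setup, inflation adjusted: ~${pet_setup_today:,.0f}")           # ~$7,950
print(f"DeepSeek-R1 build: ${deepseek_build[0]:,} - ${deepseek_build[1]:,}")
print(f"...with a GPU:     ${with_gpu[0]:,} - ${with_gpu[1]:,}")
```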
What Does This Tell Us About Technology and Value?
This price comparison reveals something fascinating about the economics of cutting-edge technology. Whether we're talking about 1977 or 2025, pushing the boundaries of what's possible on a personal computer seems to consistently cost about the same in real terms – roughly the price of a decent used car.
But what we get for that money has changed dramatically:
The PET with its disk drive could handle basic programming, simple calculations, and data storage; it even ran a version of ELIZA, the AI of the '70s
A DeepSeek-R1 system can process and generate human-like text, reason about complex problems, and assist with sophisticated analytical tasks
Looking Forward
Just as the Commodore PET helped pave the way for the personal computing revolution, today's local AI setups might represent the beginning of a new era in personal artificial intelligence. Will we look back at these systems in 48 years with the same nostalgic amusement with which we now view the PET and its disk drive?
The cost of entry to cutting-edge computing hasn't changed much in relative terms, but the capabilities we get for that investment continue to expand exponentially. It makes one wonder: what will be possible with a similar investment in years to come?
Whether you're considering building a system for DeepSeek-R1 or just curious about the evolution of computing, it's worth remembering that today's cutting-edge technology, like the pioneering machines of the past, represents more than just its components – it's an investment in the future of computing itself.
A Fun Thought Experiment: Building a PET with Modern Components
Just for amusement, let's calculate what it would cost to build a computer with Commodore PET specifications using modern components:
RAM: 4KB of modern RAM would cost... about $0.001 (a tiny fraction of even the cheapest RAM stick)
CPU: A 1 MHz processor would cost... effectively nothing. The cheapest modern microcontroller running at 1 MHz would be around $0.50
Storage: 100KB disk storage would cost... about $0.0001 (an almost immeasurably small fraction of even the cheapest modern storage)
Display: A monochrome display similar to the PET's... maybe $20 for a basic LCD
Keyboard: Basic membrane keyboard... $10
Case: Simple plastic enclosure... $15
Total cost for PET-equivalent specs today: Around $50, mostly driven by the physical components (case, display, keyboard) rather than the computing elements. The actual computing power in a PET would cost less than a dollar to replicate with modern components!
This really drives home how far we've come – the computational power that cost $795 in 1977 would be essentially free today, while our $10,900 DeepSeek-R1 system would have been worth over $312 million in 1977-era technology. That's a staggering ratio of 28,708:1 – for every dollar you spend on computing power today, you're getting nearly thirty thousand dollars worth of 1977 computing capability!
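For completeness, here's the same tally and the headline ratio as a short script, using the rough part prices above rather than real quotes:

```python
# Tallying the PET-with-modern-parts thought experiment.

modern_pet_parts = {
    "4KB RAM":        0.001,
    "1 MHz MCU":      0.50,
    "100KB storage":  0.0001,
    "mono display":   20.00,
    "keyboard":       10.00,
    "case":           15.00,
}
total = sum(modern_pet_parts.values())
print(f"PET-equivalent build today: ~${total:,.2f}")   # ~$45.50, call it $50

# And the headline ratio: what $10,900 buys now versus what it would have
# taken in 1977-era hardware (393,600 PETs at $795 each).
ratio = (393_600 * 795) / 10_900
print(f"1977 dollars of computing per 2025 dollar: ~{ratio:,.0f}:1")  # ~28,708:1
```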
The Smartphone Perspective
It's worth noting that while we're discussing building a dedicated system for running AI models, many of us are carrying incredible computing power in our pockets. A modern iPhone 15 Pro, retailing at around $1,000, packs 8GB of RAM, up to 1TB of storage, and an A17 Pro chip whose Neural Engine is rated at roughly 35 trillion operations per second. It's fascinating that this pocket-sized device has more RAM than 2 million Commodore PETs combined, yet only about 1% of the RAM we need for running DeepSeek-R1. This illustrates an interesting divergence in computing requirements: while general computing tasks have become incredibly efficient and miniaturized, running full-scale AI models locally still demands the kind of hardware scale we haven't seen since the mainframe era. We've come full circle – just as the PET brought mainframe-like computing into homes in 1977, enthusiasts in 2025 are now bringing datacenter-scale AI capabilities into their personal spaces.
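The pocket-computer comparison works out like this, using the same rough figures as above:

```python
# How the phone in your pocket stacks up against the PET and the AI build.

iphone_ram_gb = 8        # iPhone 15 Pro
pet_ram_kb = 4           # Commodore PET
deepseek_ram_gb = 768    # the build described above

pets_in_an_iphone = iphone_ram_gb * 1e9 / (pet_ram_kb * 1e3)
fraction_of_build = iphone_ram_gb / deepseek_ram_gb

print(f"PETs' worth of RAM in an iPhone 15 Pro: ~{pets_in_an_iphone:,.0f}")  # ~2,000,000
print(f"Share of the DeepSeek-R1 build's RAM:   ~{fraction_of_build:.1%}")   # ~1.0%
```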
The Economic Perspective: Build vs. Subscribe
Let's consider the economics of building a DeepSeek-R1 capable system versus using cloud-based AI services at $200 per month:
The initial hardware investment for this build ranges from $6,100 to $10,900, with an optional GPU adding another $1,500 to $2,000. Assuming heavy usage, monthly power costs run around $50-75.
Against a $200 monthly subscription, the break-even point for the basic build ($6,100) is approximately 31 months, while the high-end build with GPU ($12,900) breaks even at roughly 65 months. Factoring in $50-75 of monthly power costs cuts the net saving to about $125-150 per month, stretching those figures to roughly 41-49 months for the basic build and 86-104 months for the high-end one.
In practical terms, that's anywhere from about two and a half years for the basic build (ignoring power) to more than eight and a half years for the high-end build once power is included. However, this calculation assumes stable service pricing and doesn't factor in potential hardware resale value or the benefits of having unlimited, private, and latency-free access to the model. Just as buying that Commodore PET provided value beyond its pure computing capabilities, running AI locally offers advantages that go beyond simple cost comparisons.
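Here's a minimal sketch of that break-even arithmetic, using the same rough assumptions: a $200 monthly subscription and $50-75 a month in power under heavy use.

```python
import math

# Break-even sketch: local build versus a $200/month cloud subscription.
SUBSCRIPTION = 200          # $/month for the cloud AI service
POWER_RANGE = (50, 75)      # $/month heavy-usage power estimate

def break_even_months(build_cost: float, power: float = 0.0) -> int:
    """First whole month in which the build has paid for itself."""
    return math.ceil(build_cost / (SUBSCRIPTION - power))

for label, cost in [("Basic build ($6,100)", 6_100),
                    ("High-end + GPU ($12,900)", 12_900)]:
    base = break_even_months(cost)
    lo = break_even_months(cost, POWER_RANGE[0])
    hi = break_even_months(cost, POWER_RANGE[1])
    print(f"{label}: {base} months ignoring power, {lo}-{hi} months with power")

# Basic build ($6,100): 31 months ignoring power, 41-49 months with power
# High-end + GPU ($12,900): 65 months ignoring power, 86-104 months with power
```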
The Big Ask
OK, the big ask: how do I convince my long-suffering wife, Eleanor, that I need this?