VOGONS


First post, by 1235

User metadata
Rank Newbie

Using AI to Resurrect Old Games: From Resolution Patches to Full Remakes
A practical guide and discussion starter for the preservation community

How This Started
I was thinking about Lands of Lore 2 — a great Westwood game from 1997 — and whether modern AI could fix its 640x480 resolution limitation to run natively at higher resolutions on modern hardware. That one question opened up a much bigger conversation about what AI can actually do for game preservation right now, and where it is heading.
This post covers what I learned, what seems realistic, and a rough guide for anyone who wants to try this themselves.

Part 1: Native Resolution Patching with AI
The Basic Idea
Lands of Lore 2 is a Windows 95 DirectDraw game locked at 640x480. Getting it running at higher resolutions natively means finding where resolution values are hardcoded in the executable and patching them — which has traditionally required a skilled reverse engineer with tools like Ghidra or IDA Pro.
The idea is to use an AI as your co-pilot in that process, with a feedback loop so it can actually see what is happening.
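To make "finding hardcoded resolution values" concrete, here is a minimal sketch of one common first pass: scanning the executable for adjacent 32-bit little-endian 640/480 constants. This is only a heuristic — it is not how LoL2 necessarily stores its mode, and real patches usually come from tracing the DirectDraw calls in Ghidra first — but it shows the shape of the work the AI would be directing.

```python
import struct

def find_resolution_constants(data: bytes, width=640, height=480):
    """Scan a binary image for a 32-bit little-endian width constant
    with the matching height constant nearby -- a common pattern for
    hardcoded display modes. Returns (width_offset, height_offset) pairs."""
    w = struct.pack("<I", width)   # 640 -> b'\x80\x02\x00\x00'
    h = struct.pack("<I", height)  # 480 -> b'\xe0\x01\x00\x00'
    hits = []
    pos = data.find(w)
    while pos != -1:
        # look for the matching height within a small window after the width
        window = data[pos + 4 : pos + 64]
        off = window.find(h)
        if off != -1:
            hits.append((pos, pos + 4 + off))
        pos = data.find(w, pos + 1)
    return hits

def patch_u32(data: bytes, offset: int, value: int) -> bytes:
    """Replace one 32-bit little-endian value at the given offset."""
    return data[:offset] + struct.pack("<I", value) + data[offset + 4:]
```

In practice you would run this over the game executable, hand the candidate offsets to the AI alongside Ghidra's disassembly of the surrounding code, and let the feedback loop below tell you which patch actually sticks.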
What You Need
Hardware:

A Linux PC works great for this — Wine, scripting tools, and Ghidra all run well on Linux
Enough storage for the game, logs, and build artifacts

Software:

Ghidra (free, open source disassembler)
Wine with DDraw debug logging enabled (WINEDEBUG=+ddraw)
A screen capture tool like scrot or ffmpeg
Python for gluing everything together
Access to Claude API (Anthropic), Kimi API (Moonshot), or similar

The Feedback Loop
This is the key insight that makes AI-assisted RE actually work. Without it, AI is just guessing in the dark. With it, things get interesting:

AI proposes a patch or asks for specific disassembly output
Automated script applies the change and runs the game under Wine
Screen capture grabs the result
AI sees what happened ("still 640x480", "UI broke", "crash on startup")
AI iterates based on what it observed

Wine's DDraw logging is particularly valuable here — it shows you exactly what resolution and surface creation calls the game is making, which can shortcut a lot of binary digging.
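A small parser makes that log useful to an AI without feeding it megabytes of trace. The exact line format varies across Wine versions — the sketch below just matches resolution-shaped values loosely on ddraw-tagged lines:

```python
import re

def summarize_ddraw_log(log_text):
    """Pull distinct WxH resolution values out of a WINEDEBUG=+ddraw trace,
    in first-seen order. Matches loosely since trace formats vary."""
    modes = []
    for line in log_text.splitlines():
        if "ddraw" not in line:
            continue
        for w, h in re.findall(r"(\d{3,4})x(\d{3,4})", line):
            mode = (int(w), int(h))
            if mode not in modes:
                modes.append(mode)
    return modes
```

If the summary only ever shows (640, 480) no matter what you patch, the game is probably deriving the mode from somewhere you have not found yet.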
Why LoL1, LoL2, and LoL3 Are Different Problems
Lands of Lore 1 (1993, DOS): Hardest. This is a DOS game using Mode 13h style video at 320x200. The graphics routines are likely hand-written assembly. Most people solve LoL1 resolution via DOSBox scaling rather than native patching, and honestly that is probably the right call. A native resolution hack here is a serious undertaking.
Lands of Lore 2 (1997, Windows 95): Medium difficulty. DirectDraw software rendering. The feedback loop approach described here is well suited to this one. Best starting point for experimentation.
Lands of Lore 3 (1999, Windows): Probably easiest. Being two years newer, it likely uses Direct3D more heavily, which paradoxically makes resolution changes easier since D3D handles more of the scaling math. Worth checking for hidden resolution arguments or registry keys before doing any patching at all.
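Under Wine that registry check is easy to script, because the hive is stored as plain text (system.reg / user.reg in the Wine prefix). The key path and value names in the test are guesses — the whole point of the scan is to discover whether an official setting exists before touching the binary:

```python
import re

def find_resolution_values(reg_text, needles=("width", "height", "resolution")):
    """Scan the text of a Wine .reg hive (e.g. ~/.wine/user.reg) for
    value names that look resolution-related.
    Returns (key, value_name, raw_value) tuples."""
    hits = []
    current_key = None
    for line in reg_text.splitlines():
        key_match = re.match(r"\[(.+?)\]", line)
        if key_match:
            current_key = key_match.group(1)
            continue
        val_match = re.match(r'"([^"]+)"=(.+)', line)
        if val_match and any(n in val_match.group(1).lower() for n in needles):
            hits.append((current_key, val_match.group(1), val_match.group(2)))
    return hits
```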
Rough Cost Estimate (2026)
Using Claude Opus for the hard reasoning, cheaper models for bulk work, with prompt caching on repeated disassembly context:

LoL2 resolution patch only: roughly $200-500 USD in API costs
Running 10 parallel instances exploring different hypotheses simultaneously probably does not cost 10x — more like 2-3x — because failed branches get pruned quickly

By 2027-2028 these costs will likely be 5-10x lower as model pricing continues to fall.

Part 2: Full Remakes from Video — A Different Approach
The Insight
What if instead of reverse engineering the binary, you just... watch the game?
YouTube has multiple full playthroughs of all three Lands of Lore games. GameFAQs has detailed guides, maps, item lists, and stat breakdowns. Speedrun communities often know game internals surprisingly well. Together these sources can serve as a complete game design document — without ever touching the executable.
This approach is:

Significantly cheaper than binary RE
Faster to produce playable results
On much safer legal ground (you are recreating the experience, not the code)
More accessible to people without RE skills

The Pipeline
Stage 1 — Video Analysis
Feed walkthrough videos to a multimodal AI. Google Gemini 2.0 Pro is currently the best choice here because it can ingest hours of video natively with a very large context window. You prompt it to extract everything systematically: room layouts, enemy behaviors, UI elements, dialogue, item pickups, puzzle solutions, game flow.
Multiple playthroughs from different players catch more content: secrets, optional areas, alternative paths.
Stage 2 — Knowledge Assembly
Scrape and feed GameFAQs guides, fan wikis, speedrun documentation, any surviving fan sites. A cheaper model processes all this and builds a structured game bible covering every system, area, item, and formula.
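One way to keep that game bible honest is to give it an explicit schema, so every area, item, and connection carries a pointer back to the walkthrough timestamp or FAQ section it came from. A minimal sketch (the field names and example data are illustrative, not a prescribed format):

```python
from dataclasses import dataclass, field, asdict

@dataclass
class Item:
    name: str
    location: str
    stats: dict = field(default_factory=dict)

@dataclass
class Area:
    name: str
    connections: list = field(default_factory=list)  # names of adjacent areas
    items: list = field(default_factory=list)
    sources: list = field(default_factory=list)      # video timestamps, FAQ sections

@dataclass
class GameBible:
    title: str
    areas: dict = field(default_factory=dict)

    def add_area(self, area):
        self.areas[area.name] = area

    def to_dict(self):
        """Serialize for feeding back to a model or dumping to JSON."""
        return asdict(self)
```

Keeping sources attached per-area means that when two walkthroughs disagree, the AI (or a human) can go back to the exact footage and resolve the conflict instead of silently picking one.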
Stage 3 — Asset Generation

Environments and sprites: Stable Diffusion or Flux trained on video frames to recreate the visual style
Music: AI audio generation tools like Suno for recreating the Westwood feel, or extract audio directly from video
3D assets for LoL3: AI 3D generation tools guided by video frames

Stage 4 — Game Construction
Godot is the recommended engine — open source, Python-like scripting, and AI models are well trained on it. The agentic loop from Part 1 comes back here:

Opus or similar handles high level architecture and complex game logic decisions
Cheaper models handle bulk code generation — rooms, items, enemy stats, dialogue trees
Your screen capture feedback loop catches bugs and compares the running remake to the reference walkthrough

Stage 5 — Automated Validation
Run the original walkthrough video alongside your remake and have the AI visually compare them scene by scene, flagging discrepancies. The walkthrough becomes an automated test suite.
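The core of that comparison can be sketched as a per-frame difference check. Real footage needs scene alignment and a perceptual metric before a raw pixel diff means anything — this minimal version (grayscale frames as nested lists, an arbitrary threshold) only shows the shape of the test:

```python
def frame_difference(a, b):
    """Mean absolute pixel difference between two equal-sized
    grayscale frames (nested lists of values 0-255)."""
    total = count = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

def flag_discrepancies(reference, remake, threshold=40.0):
    """Compare scene-aligned frame pairs; return the indices that differ
    enough to warrant a closer look by a human or a multimodal model."""
    return [i for i, (r, m) in enumerate(zip(reference, remake))
            if frame_difference(r, m) > threshold]
```

Anything flagged goes to the multimodal model with both frames side by side, which is what turns the walkthrough into a test suite rather than just a reference.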
Recommended AI Stack
Role                  | Best Choice        | Budget Alternative
Video analysis        | Gemini 2.0 Pro     | GPT-4o
Knowledge assembly    | Kimi (Moonshot)    | Claude Sonnet
Architecture decisions| Claude Opus        | Claude Sonnet
Bulk code generation  | Kimi               | Kimi
Image generation      | Flux / Midjourney  | Stable Diffusion local
Music recreation      | Suno               | Udio
Game engine           | Godot              | Godot
Cost Estimates by Year
These assume 2 Opus instances + 8 Kimi instances running the full remake pipeline. Costs fall each year as model pricing drops and capabilities improve.
Lands of Lore 2 (Medium complexity)

Part 3: What Would Actually Help Right Now
If anyone in the VOGONS community wants to start preparing for this kind of project, the most useful contributions would be:
Complete walkthrough recordings — multiple playthroughs of all three games, ideally showing all areas including optional content, all enemy types, and all UI states. Higher resolution recordings are better. Slow and methodical is better than speedrunning.
Structured GameFAQs knowledge — the existing guides are great but pulling them into a clean structured format (all items in one list, all enemies with stats, all area connections mapped) would dramatically reduce AI processing costs.
Speedrunner knowledge — speedrun communities often have detailed understanding of game mechanics, damage formulas, and engine quirks that never made it into any FAQ. This is gold for reconstruction.
Technical documentation — if anyone has already done RE work on any of the three games, even partial notes about executable structure or graphics calls, that is incredibly valuable.

Final Thoughts
The most interesting thing about this whole approach is that it flips the usual preservation workflow. Instead of trying to make old code run on new hardware, you reconstruct the experience from observation and rebuild it on modern foundations. The original executable becomes almost irrelevant.
This is not science fiction — the tools to do this exist today. The main limiting factor right now is cost, and that is falling fast. By 2028 this will be genuinely accessible to enthusiast communities with modest budgets.
If anyone is already working on something similar, or has thoughts on the approach, I would love to hear about it.

Note: All cost estimates are speculative and based on current API pricing trends. Actual costs will vary significantly based on project scope, how efficiently the pipeline is designed, and how model pricing evolves. Legal considerations around game reconstruction vary by jurisdiction and circumstance — do your own research before distributing anything.

Pulled together from some spitballing docs from AIs.

Reply 1 of 11, by jmarsh

User metadata
Rank Oldbie

This is just AI advertising itself.

Reply 2 of 11, by feda

User metadata
Rank Member
1235 wrote on Yesterday, 18:38:

This post covers what I learned, what seems realistic, and a rough guide for anyone who wants to try this themselves.

Why don't you lead by example and try it yourself? And show us the results (and cost) 🤣

Also, LoL2 is 3D accelerated so I believe it can already be forced to render in larger resolutions using wrappers, no AI necessary.

Reply 3 of 11, by leileilol

User metadata
Rank l33t++

fuck no

apsosig.png
long live PCem

Reply 4 of 11, by twiz11

User metadata
Rank Oldbie
jmarsh wrote on Yesterday, 19:45:

This is just AI advertising itself.

I believe it's only a matter of time before we are dumb enough to believe AI on how to fix our games, no doubt the AI trained itself off VOGONS

Reply 5 of 11, by megatron-uk

User metadata
Rank l33t

FFS. Just no.

My collection database and technical wiki:
https://www.target-earth.net

Reply 6 of 11, by 1235

User metadata
Rank Newbie

Has anyone dug into LoL2's executable for native resolution support?
I know wrappers like dgVoodoo2 can already upscale it, but I'm curious whether anyone has looked at patching the executable itself for native higher res output. I'm planning to set up a Wine DDraw logging loop to see what resolution calls it's actually making, and pair that with Ghidra to find where things are hardcoded.
Before I go down that rabbit hole — has this been attempted before? Would save me some time knowing what dead ends already exist.
Also curious whether LoL3 has any hidden resolution arguments — being two years newer I'd expect it to be more flexible.

Reply to jmarsh ("This is just AI advertising itself"):

Fair enough — the first post read more like a pitch than a question. I got carried away writing it up before actually doing anything. Point taken.

Reply to feda ("Why don't you lead by example"):

You're completely right and that's exactly what I should have done before posting. Lesson learned — come back when I have actual DDraw logs and something broke.
On LoL2 being 3D accelerated — do you know which wrapper gives the cleanest result? dgVoodoo2 or something else? Saves me reinventing the wheel if the wrapper route is already solid.

Reply to leileilol ("fuck no / long live PCem"):

PCem accuracy is hard to argue with. For pure preservation that's obviously the right answer. My interest is more in running it natively on modern hardware without the emulation overhead, but I get that's a different goal than what PCem is for.

Reply to twiz11 ("only a matter of time before we believe AI how to fix our games"):

Skepticism is fair — there's a lot of AI hype that goes nowhere. I'm not interested in taking AI's word for anything, more using it as a tool in a loop where I can see exactly what it's doing and whether it's actually working. But yeah, the first post was too much talking and not enough doing.

Reply to megatron-uk ("FFS. Just no."):

Noted. I'll come back if I actually have results worth sharing.

Reply 7 of 11, by jmarsh

User metadata
Rank Oldbie
1235 wrote on Today, 08:25:

I'm not interested in taking AI's word for anything, more using it as a tool in a loop where I can see exactly what it's doing and whether it's actually working.

You're using AI to write these posts, and it's really obvious. The AI couldn't even tell the difference between leileilol's post and their tagline.

Reply 9 of 11, by NeoG_

User metadata
Rank Member

I vote to banhammer

98/DOS Rig: BabyAT AladdinV, K6-2+/550, V3 2000, 128MB PC100, 20GB HDD, 128GB SD2IDE, SB Live!, SB16-SCSI, PicoGUS, WP32 McCake, iNFRA CD, ZIP100
XP Rig: Lian Li PC-10 ATX, Gigabyte X38-DQ6, Core2Duo E6850, ATi HD5870, 2GB DDR2, 2TB HDD, X-Fi XtremeGamer

Reply 10 of 11, by Joseph_Joestar

User metadata
Rank l33t++

What's up with the recent invasion of AI bots on Vogons? Just a few days ago we had one that was necrobumping old threads and telling people to upgrade from Win98 because of "licensing issues" or something.

We don't need AI slop here. Kindly go and peddle your wares elsewhere.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium