While it’s true that graphics for older console games were designed with the “fuzziness” of CRT televisions in mind, I have a problem with the assertion that experiencing those games’ graphics with sharp pixel boundaries is ahistorical, especially if we’re talking – as we often are – about the Super Nintendo generation of consoles.
The fact of the matter is that console emulators for PC had become commonplace while fourth-generation consoles like the Super Nintendo were still current, and that’s how many folks played those games. Heck, a lot of games that are now regarded as classics of the era – Final Fantasy V, Seiken Densetsu 3, Terranigma, etc. – were played exclusively via PC emulator in North America when they were current, owing to the fact that they didn’t receive official releases in that region until much later.
Sure, seeing the graphics with the razor-sharp pixels of PC monitors isn’t how those games were “meant” to be experienced, but it’s not a matter of later generations misinterpreting them due to faulty ports. Plenty of folks were experiencing them that way even when they were new – and frankly, holding up the developer-intended experience as more historically authentic than the alternatives, even when those alternatives were demonstrably as or more common (at least for certain games and in certain regions), has some really weird implications for how we think of the history of media.
(To anticipate the obvious objection: yes, PC monitors were typically also CRTs during the fourth console generation, but they had pixel sharpness comparable to that of modern LCD monitors, owing to their higher resolutions and finer dot pitch. This was a sufficiently recognised issue that some emulators contemporary with the fourth-gen consoles themselves implemented CRT filters!)