
There’s a particular kind of gamer who will tell you modern RPGs are better than ever. And there’s another kind who insists the genre peaked in 1999 and has been slowly suffocating since.
The truth – as usual – is more interesting than either position.
Because if you look at what happened to RPGs over the last decade without ideological blinders, you see something rare in the entertainment industry: a genre that genuinely died, and then genuinely came back. Not as nostalgia bait. Not as a calculated reboot. As something that actually learned from its failures.
That story is worth telling properly.

Let me be honest about something upfront: the phrase “real RPG” is a minefield. Every gamer has their own definition, and most of those definitions conveniently place their favourite game at the centre.
So let me be specific instead of abstract.
Baldur’s Gate 2 (2000) featured companions who would leave your party if your moral choices conflicted with their personal convictions. Not as a punishment – as a consequence. Minsc would follow you cheerfully into genocide if you let him. Aerie would not. The game understood that characters with actual beliefs create friction, and that friction creates drama.
Planescape: Torment (1999) asked its central question – “What can change the nature of a man?” – and never answered it. It gave you 40 hours of play and let you arrive at your own conclusion. The protagonist had no fixed identity. Every death changed something. The game’s entire mechanical and narrative structure was built around philosophical inquiry. That’s not common in any medium, let alone games.
Morrowind (2002) dropped you into a world and explained nothing. You received a letter, a name, and a city. You had to ask NPCs for directions. You could get lost for two hours and discover something more interesting than your original destination. The world didn’t scale to you – you scaled to it.
These aren’t just good games. They represent a specific design philosophy: the game is a system with its own internal logic, and your job is to learn and engage with that system. The world doesn’t bend to accommodate you. You adapt, grow, and eventually – if you pay attention – master it.
That philosophy is what eroded. And it didn’t erode all at once.

The decline wasn’t a sudden event. It was a gradual accommodation to a perceived audience – an audience that the evidence suggests was largely imaginary.
Oblivion (2006) was the first significant step. Compared to Morrowind, it added quest markers (you now had an arrow pointing to your objective), introduced level scaling (enemies grew stronger as you did, eliminating the sense of dangerous zones), and streamlined the skill system. Individual changes that each seemed reasonable. Collectively, they represented a shift in philosophy: the game should be accessible, not demanding.
The key problem with level scaling deserves attention because it’s rarely discussed precisely enough. In Morrowind, you were genuinely afraid of cliff racers in certain areas early in the game. That fear created topography – some places were dangerous, others were safe, and the journey from one to the other represented real growth. When you finally fought your way through a zone that had previously destroyed you, the victory was earned.
Level scaling eliminated that topography. In Oblivion, you were always “appropriately challenged.” Which sounds fair until you realise it means you never feel genuinely powerful. You never feel the growth. You feel like you’re on a treadmill that adjusts its speed to keep you permanently breathless.
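The contrast is easy to make concrete. Here's a toy sketch in Python – with entirely invented numbers and function names, not any real game's formulas – of the two difficulty philosophies:

```python
# Toy illustration of two enemy-difficulty schemes.
# All values are invented for the example; no actual game works exactly like this.

def zone_based_difficulty(zone_danger: int, player_level: int) -> int:
    """Morrowind-style: the zone's danger is fixed.

    A level-1 character in a danger-30 zone is hopelessly outmatched;
    a level-40 character in the same zone steamrolls it. The gap between
    those two visits is where the feeling of growth lives.
    """
    return zone_danger  # the world ignores the player entirely

def level_scaled_difficulty(zone_danger: int, player_level: int) -> int:
    """Oblivion-style: enemies track the player's level.

    The relative challenge stays roughly constant, so no zone is ever
    truly forbidding - and none ever becomes trivially easy either.
    """
    return player_level + 2  # always "appropriately challenging"

for level in (1, 20, 40):
    fixed = zone_based_difficulty(30, level)
    scaled = level_scaled_difficulty(30, level)
    print(f"player level {level:2d}: fixed-zone enemy {fixed}, scaled enemy {scaled}")
```

Run the loop and the point makes itself: in the fixed scheme the enemy's strength never moves, so the player's relationship to it changes dramatically over forty levels; in the scaled scheme the enemy shadows the player forever, and the "treadmill" is right there in the arithmetic.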
Skyrim (2011) continued this trajectory – removing character classes, simplifying magic, reducing dialogue choices – and sold over 30 million copies. Bethesda drew what seemed like an obvious conclusion: simplification equals sales.
This conclusion was not wrong, exactly. But it was incomplete.
Dragon Age: Inquisition (2014) became the symbol of a different kind of problem. Here was BioWare – the studio behind Baldur’s Gate, Knights of the Old Republic, Dragon Age: Origins – making an open world. Which sounds promising. Open world means space, freedom, possibility.
What it meant in practice: 50+ map icons. Repeatable enemy camps. Identical fortresses. The same “collect 10 of X” objectives duplicated across every region. The mechanical design communicated clearly: content is quantity, not quality.
Compare the side quest structure of Dragon Age: Inquisition to Baldur’s Gate 2, where even minor errands – retrieve a book, find a missing person – could spiral into investigation, moral complexity, and choices with lasting consequences. The difference isn’t budget. BioWare had far more resources in 2014 than it had in 2000, when it made Baldur’s Gate 2. The difference is intent.
Dragon Age: Inquisition won Game of the Year awards from multiple publications. It sold millions. It confirmed for the industry’s executives what they already suspected: depth is optional. Players want content. They’ll mistake quantity for quality.

What makes this period genuinely interesting is that it wasn’t monolithic. While the major studios were optimising for accessibility and content volume, something else was happening in parallel.
The Witcher 3: Wild Hunt (2015) demonstrated that open worlds and meaningful content aren’t contradictory. CD Projekt RED built a game where even side quests were complete narrative experiences. A quest about a nobleman cursed into the form of a pig sounds absurd on paper and turns unexpectedly poignant in play. The Bloody Baron questline is better written than most prestige television drama from the same year.
The Witcher 3 is worth examining carefully because it didn’t succeed by rejecting accessibility. It’s not a punishing game. The combat is manageable, the map is filled with markers, the game is generous with explanation and guidance. What it did differently was refuse to treat those markers as substitutes for content. Every icon on the map represented something designed with intent.
Divinity: Original Sin 2 (2017) from Belgian studio Larian went in a different direction – not toward accessibility, but toward tactical depth that the genre had largely abandoned. Turn-based combat with environmental interactions, positioning, and consequences for every decision. The game didn’t apologise for its complexity.
Its commercial success was significant: players who had grown up on Baldur’s Gate existed in sufficient numbers to make a deeply complex isometric RPG financially viable. Larian had found an audience the industry had declared extinct.
Pillars of Eternity (2015) proved the same point differently. Obsidian Entertainment crowdfunded a conscious return to the 1990s isometric RPG – dense text, complex systems, meaningful choices. The campaign raised nearly four million dollars. Players paid in advance for exactly the kind of game that major studios had decided nobody wanted.
The message was clear, even if the industry wasn’t listening: there was a substantial audience for complexity. It hadn’t gone anywhere. It had simply been given simple games for years and had largely stopped expecting anything else.

Then came two games that changed the conversation entirely.
Elden Ring (2022) is worth examining not just as a great game, but as a cultural event. FromSoftware had been building their design philosophy for over a decade – no tutorials, no quest markers, enemies that will kill you without hesitation, a world that doesn’t scale to accommodate you. In genre terms, this is maximally hostile to the accessibility trends of the previous decade.
Elden Ring sold over 20 million copies.
Think about what that means. A game with no hand-holding, no map icons guiding you to objectives, no difficulty modes, no “you got this!” encouragement pop-ups – became one of the best-selling games of its year, from a studio that had never previously reached that commercial scale.
The game gave players something they apparently hadn’t experienced in a long time: the authentic feeling of a world that doesn’t revolve around them. The sensation of being genuinely small and genuinely threatened. And consequently – when you defeat a boss on your fifteenth attempt – a genuine, unqualified victory. Not a participation trophy. Not a difficulty-adjusted experience calibrated to keep you comfortable. A real win.
The industry had spent fifteen years assuming players wanted to feel powerful. Elden Ring suggested many of them actually wanted to feel capable – which requires first being challenged.
Baldur’s Gate 3 (2023) completed the argument.
Larian Studios built what the industry had repeatedly declared commercially non-viable: a fully realised isometric RPG with turn-based combat, hundreds of hours of content, and a level of reactivity in its world design that players are still discovering months after release. A game where your choices genuinely alter the course of events. Where NPCs remember what you did. Where you can fail quests, damage relationships, make catastrophically wrong decisions – and the game continues with the consequences, because that’s what choices mean.
Baldur’s Gate 3 won over 200 Game of the Year awards across publications. It sold millions of copies – far beyond what any publisher would have projected for this type of game. It became a cultural moment.
What it proved was simple but important: players are not stupider than they were in 2000. They didn’t evolve away from complexity. They were given simple games for years and learned to expect simplicity. Given the alternative, many of them chose it immediately.
There’s a tempting narrative here – old games good, new games bad, complexity always better than accessibility – and I want to resist it, because it’s too clean.
Skyrim is not a failure. It brought millions of people into the genre, and some of them kept exploring, eventually arriving at games like Elden Ring and Baldur’s Gate 3. The accessibility pipeline has real value.
Dragon Age: Inquisition is not a disaster. It has genuine merits – the companion writing is often excellent, the world has moments of real beauty. Its failure is specifically the disconnect between the quality of its narrative ambitions and the mechanical banality of its open world.
The actual lesson isn’t “simple games are bad.” It’s that studios used simplification as a replacement for depth rather than an entry point to it. They simplified the wrong things. The goal of accessibility should be to reduce unnecessary friction – confusing UI, opaque systems, poor onboarding – not to eliminate the challenge and consequence that make the genre meaningful.
The studios that understood this distinction – Larian, FromSoftware, CD Projekt RED, Obsidian – are currently making the most acclaimed RPGs in the genre’s history.
The studios still treating 40 map icons as content design are losing their audiences slowly and, for the most part, don’t yet understand why.
RPGs haven’t simply returned to where they were. The best contemporary RPGs – Baldur’s Gate 3, Elden Ring, Disco Elysium – are more technically sophisticated, more narratively ambitious, and more mechanically refined than their predecessors. They’ve inherited the design philosophy of the 1990s and applied it with tools and budgets those studios never had.

What changed is that the industry briefly convinced itself that players didn’t want what they actually wanted. A decade of commercial data has corrected that misunderstanding.
The genre is not just alive. It may be at its peak.
It just took a fifteen-year detour through map icons to get here.
If this sparked any thoughts, I’m curious: which was the more significant turning point – The Witcher 3 demonstrating that quality open worlds were possible, or Elden Ring proving that difficulty had commercial appeal?






