How to procedurally generate Zelda dungeons

Yesterday I gave a microtalk at one of the incredible Unknown Worlds Entertainment’s postmortem get-togethers. The talk is appropriately titled How to procedurally generate Zelda dungeons. I hope to see the recordings online soon. Until then, here are my slides and slide notes:

  1. [Introduction]
  2. Zelda dungeons are awesome because they combine fighting and puzzles. Let’s focus on the puzzles.
  3. The dungeon puzzle is really just obstacles and the collectable items needed to overcome them.
  4. Example: You enter the dungeon in the southernmost room. In the next room you’ll see four exits, one of which is blocked by a door with a keyhole. You’ll need to go east to find the key. You’ll also notice another door there that unlocks if you can light the torch next to it. To do this you must solve all of the west/north part of the dungeon. In the northwesternmost room you’ll find the satchel with ember seeds. You’ll need to go all the way back to light the torch and continue eastwards to beat the dungeon’s boss. Note that exploring the dungeon is highly nonlinear and requires lots of backtracking.
  5. [Puzzle core graph] The puzzle itself can be represented as a graph. I will explain how to generate graphs like this one. Being able to do so is exciting, because it solves a classic problem in game design, i.e.
  6. The conflict between programmers and designers: generating content for novelty and replay value vs. creating content for a specific purpose and artistic vision, unlimited new stories vs. one good story. You can have both. Generating complex dungeon puzzles is creating good new stories.
  7. I’m using a genetic algorithm to grow these puzzle graphs. I’m guiding the evolution to favor purposeful puzzles. The genetic algorithm needs to know the following three things:
  8. [Graph encoding to genome] Encoding. You can encode a graph into a genome by using a list of edges referring to their nodes by number. Obstacle edges need two nodes, item edges only one. Each edge becomes a gene.
  9. Mutation. Meaningful operations on the genes include, for example, changing node numbers, items or obstacles. Operations on the genome include adding and removing genes, of course. (A rough code sketch of the encoding and mutation steps follows after this list.)
  10. [Puzzle optimization] Evaluation. Decode the genome grown by the genetic algorithm into a graph. Run a brute-force crawler on the graph. Define objectives like no dead ends, a long shortest path, etc. to guide the evolution. Use multi-objective optimization.
  11. [Result example]
  12. [End]
  13. [Contact information]
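
To make slides 8 and 9 concrete, here is a minimal Python sketch of the edge-list genome and the mutation operators. The Gene class, the item and obstacle vocabularies, and the room numbers in the example are illustrative assumptions of mine, not something from the talk.

```python
import copy
import random

# Illustrative item and obstacle vocabularies (assumptions, not from the talk).
ITEMS = ["small key", "ember seeds"]
OBSTACLES = ["locked door", "unlit torch"]

class Gene:
    """One edge of the puzzle graph, as described on slide 8.

    An obstacle gene connects two nodes (rooms); an item gene places a
    collectable item in a single node."""
    def __init__(self, kind, label, nodes):
        self.kind = kind      # "obstacle" or "item"
        self.label = label    # which obstacle or item
        self.nodes = nodes    # two node numbers for obstacles, one for items

def random_gene(num_nodes):
    if random.random() < 0.5:
        return Gene("obstacle", random.choice(OBSTACLES),
                    [random.randrange(num_nodes), random.randrange(num_nodes)])
    return Gene("item", random.choice(ITEMS), [random.randrange(num_nodes)])

def mutate(genome, num_nodes):
    """One mutation step in the spirit of slide 9: change a node number,
    change an item/obstacle, or add/remove a whole gene."""
    genome = copy.deepcopy(genome)
    op = random.choice(["change_node", "change_label", "add_gene", "remove_gene"])
    if op == "add_gene" or not genome:
        genome.append(random_gene(num_nodes))
    elif op == "remove_gene":
        genome.pop(random.randrange(len(genome)))
    elif op == "change_node":
        gene = random.choice(genome)
        gene.nodes[random.randrange(len(gene.nodes))] = random.randrange(num_nodes)
    else:
        gene = random.choice(genome)
        gene.label = random.choice(OBSTACLES if gene.kind == "obstacle" else ITEMS)
    return genome

# A hand-made genome loosely following the slide 4 example (room numbers made up):
# a locked door, the key that opens it, a torch door, and the ember seeds.
genome = [
    Gene("obstacle", "locked door", [1, 2]),
    Gene("item", "small key", [3]),
    Gene("obstacle", "unlit torch", [1, 6]),
    Gene("item", "ember seeds", [5]),
]
offspring = mutate(genome, num_nodes=8)
```

Decoding goes the other way: read the genes back into nodes and edges and hand the resulting graph to the evaluation step of slide 10.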

Note that these puzzles are not limited to games like Zelda. They can be found in pretty much any game involving puzzles and are also at the heart of every classic adventure game. Of course there are lots of details that still need to be explained, but these microtalks are limited to five minutes. If you are interested in the topic, please let me know by leaving a comment!

How to procedurally generate Zelda dungeons (PDF)

Infinite Zelda project

Some years ago I started working on a project I call Infinite Zelda. It’s a lot like Infinite Mario Bros!, just for The Legend of Zelda. It’s basically a level generator. The goal is to create fun-to-play Zelda-like dungeons.

A Zelda-like dungeon consists of connected rooms with enemies, keys and doors, bombs and cracks, switches, gaps, one-way paths and what have you. The player has a wealth of items at his disposal to overcome the challenges of the dungeon, beat the boss and eventually rescue the princess. Yay!

Generating such a dungeon is difficult, especially a fun-to-play one, even if you leave out the storytelling. Let’s focus on the puzzle aspects. There are constraints that must be met to prevent player frustration (a rough checking sketch follows after the list):

  1. The dungeon must be solvable. There must be a way.
  2. The player must not get stuck. There must not be a total dead-end.
  3. Every item and obstacle should be essential. You don’t want more keys than doors.
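
As a sketch of how the first two constraints can be checked, the following Python snippet runs a brute-force crawler over (room, collected items) states on a toy dungeon. The representation, the room numbers and the reusable-key simplification are assumptions made for illustration, not the project’s actual model.

```python
from collections import deque

# Hypothetical toy dungeon: undirected connections that may require an item,
# plus items lying in rooms. Keys are treated as reusable for simplicity.
connections = [
    (0, 1, None),       # free passage
    (1, 2, "key"),      # locked door
    (1, 3, None),
    (3, 4, None),
]
items_in_room = {4: ["key"]}
start, goal = 0, 2

def reachable_states(start_room):
    """Brute-force crawl over (room, collected items) states."""
    first = (start_room, frozenset())
    seen, queue = {first}, deque([first])
    while queue:
        room, items = queue.popleft()
        items = items | frozenset(items_in_room.get(room, ()))
        for a, b, needed in connections:
            if room not in (a, b) or (needed is not None and needed not in items):
                continue
            state = (b if room == a else a, items)
            if state not in seen:
                seen.add(state)
                queue.append(state)
    return seen

states = reachable_states(start)
print("solvable:", any(room == goal for room, _ in states))   # constraint 1
# Constraint 2 (no total dead end) additionally requires that the goal stays
# reachable from every state the player can get into, e.g. by re-crawling
# from each reachable state. Constraint 3 is a counting argument on top.
```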

In a series of posts I’ll explain what it takes to generate such dungeons. I’ll break down the challenges in detail and present my approach. I’ll show how multiple layers of abstraction help model each part. Over the course of the series it will become clear how all of this is related to genetic algorithms, abstract syntax tree interpreters and formal verification.

Efficient Neural Network Pruning during Neuro-Evolution

A paper based on my student research project got published at the International Joint Conference on Neural Networks, IJCNN 2009. The conference is held jointly by the International Neural Network Society (INNS) and the IEEE Computational Intelligence Society, the two leading professional organizations for researchers working in neural networks. The paper can be found in the IEEE CS digital library. The abstract reads as follows:

In this article we present a new method for the pruning of unnecessary connections from neural networks created by an evolutionary algorithm (neuro-evolution). Pruning not only decreases the complexity of the network but also improves the numerical stability of the parameter optimisation process. We show results from experiments where connection pruning is incorporated into EANT2, an evolutionary reinforcement learning algorithm for both the topology and parameters of neural networks. By analysing data from the evolutionary optimisation process that determines the network’s parameters, candidate connections for removal are identified without the need for extensive additional calculations.

Efficient Neural Network Pruning during Neuro-Evolution (PDF)
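
The paper describes the actual pruning criterion; purely as an illustration of the idea, the Python sketch below flags a connection as a pruning candidate when its optimised weight is small compared with the per-parameter spread the optimiser already reports, so no extensive additional calculations are needed. The threshold and the numbers are made up.

```python
import numpy as np

# Illustrative stand-in criterion, not the one from the paper: a connection
# whose final weight is small relative to the spread the parameter optimiser
# reports for it is a candidate for removal.
weights = np.array([ 1.7, -0.03,  0.9,  0.02, -1.1])   # final connection weights
spread  = np.array([ 0.2,  0.40,  0.1,  0.35,  0.15])  # e.g. per-parameter std. dev.

candidates = np.abs(weights) < 0.5 * spread   # hypothetical threshold
print("pruning candidates:", np.flatnonzero(candidates))   # -> [1 3]
```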

Studienarbeit (student research project)

Der Einfluss neuronaler Topologien auf die Effizienz der Parameteroptimierung (The influence of neural topologies on the efficiency of parameter optimisation)

Optimising large neural networks is difficult in general. Reasons for this are the large number of adjustable parameters (curse of dimensionality) and the often poor conditioning of the optimisation problem. Methods for optimising the parameters of these networks are therefore sometimes ineffective, and often less efficient than for smaller, well-conditioned problems. The goal of this work is to develop a method that reduces the number of parameters and improves the conditioning without worsening the error of the neural network.

We generate neural networks with EANT2. The network topology is created by a genetic algorithm, and the parameters are optimised with CMA-ES. EANT2 has already been applied successfully to various problems, such as visual servoing and edge/corner detection. Our experiments with various problems have shown that the neural networks quickly grow to a size of about 100 neuron connections, some of which are unnecessary, and that the networks are poorly conditioned.

Der Einfluss neuronaler Topologien auf die Effizienz der Parameteroptimierung (PDF)
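
The conditioning part of this goal is easiest to see on a toy problem. The sketch below is only an analogy using a linear least-squares model, not an EANT2 network: a nearly redundant parameter inflates the condition number of the problem, and removing it restores good conditioning without changing what the model can express.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)

# Three parameters, where the third input is almost a copy of the first.
X_full = np.column_stack([x1, x2, x1 + 1e-3 * rng.normal(size=200)])
X_pruned = X_full[:, :2]   # remove the redundant parameter

# Condition number of the (Gauss-Newton) Hessian X^T X of the squared error:
print(np.linalg.cond(X_full.T @ X_full))     # very large: badly conditioned
print(np.linalg.cond(X_pruned.T @ X_pruned)) # small: well conditioned
```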