Parallel Wave Function Collapse

The Wave Function Collapse algorithm is a procedural generation technique that borrows its central metaphor from quantum mechanics. There, a wave function represents a quantum system's state by encompassing all of its possible states at once, a condition known as 'superposition.' The superposition collapses and converges to a single definite state only when an observation is made.

Goal

The Wave Function Collapse algorithm is widely known for its impressive capacity to generate large outputs that locally resemble a small input. However, due to its significant computational demands, it is subject to input-size restrictions and can cause extensive delays. To address these obstacles, our team seeks to enhance the algorithm's performance by introducing parallelism to execute specific tasks concurrently.

Components

  1. Image Processor
  2. WFC Core

Image Processor

The Image Processor is a crucial part of our system that prepares images for use in the Wave Function Collapse algorithm. It helps generate important information like sample IDs and their relationships with neighboring samples.

This process involves three main steps:

  1. Collecting all NxN patterns
  2. Setting up frequency rules
  3. Determining adjacency rules

Pattern/Sample Collection

Patterns are collected by stepping over every pixel and, from each one, grabbing the pixels up to N-1 steps east-ward and south-ward, yielding an NxN window; from these windows we get the raw patterns.

// src/image_reader.rs
pub fn sample(&self, n: i32) -> Vec<Sample> { ... }
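The collection step above can be sketched roughly as follows. This is a hedged illustration, not the project's actual implementation: the `Pixel` and `Sample` types and the wrap-around at the image edges (toroidal sampling, a common WFC choice) are assumptions.

```rust
// Illustrative sketch: visit every pixel and take the NxN window whose
// top-left corner it is, wrapping at the image edges. Types are
// placeholders, not the project's actual ones.
type Pixel = u32;
type Sample = Vec<Pixel>;

pub fn sample(pixels: &[Vec<Pixel>], n: usize) -> Vec<Sample> {
    let (h, w) = (pixels.len(), pixels[0].len());
    let mut samples = Vec::with_capacity(w * h);
    for y in 0..h {
        for x in 0..w {
            // The window covers this pixel plus N-1 pixels east and south
            let mut s = Vec::with_capacity(n * n);
            for dy in 0..n {
                for dx in 0..n {
                    s.push(pixels[(y + dy) % h][(x + dx) % w]);
                }
            }
            samples.push(s);
        }
    }
    samples
}
```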

Frequency Rule

Working with raw patterns can be problematic, which is why we remove duplicates. Luckily, this step serves a dual purpose: while deduplicating we can also count how many copies of each pattern we saw, and that count feeds directly into our frequency rule.

// src/model.rs
// Calculate the number of times each unique sample appears
let freq_map: HashMap<Sample, i32> = unprocessed_samples.iter().fold(
    HashMap::<Sample, i32>::new(),
    |mut freq_map, sample| {
        *freq_map.entry(sample.clone()).or_insert(0) += 1;
        freq_map
    },
);
Notice how the two patterns on the left are the same. The frequency of this pattern will be 2.

Adjacency Rule

The final step is generating the adjacency rules. To accomplish this, we compare each sample to all other samples, including itself, and check whether their overlapping N x (N-1) regions match in each direction. For instance, we check whether the bottom N-1 rows of sample A match the top N-1 rows of sample B. Doing this for every pair of samples gives us the necessary adjacency rules.

// src/model.rs
// Create adjacency rules
for s1 in 0..samples.len() {
    for s2 in 0..samples.len() {
        for direction in &ALL_DIRECTIONS {
            if samples[s1].compatible(&samples[s2], *direction) {
                adjacency_rules[s1][direction.to_idx()].insert(s2);
            }
        }
    }
}
An image showing the left-right compatibility between two NxN patterns.
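The `compatible` check used above is not shown in the snippet; here is a hedged sketch of how it might work, assuming samples are stored row-major. The `Direction` enum and field names are illustrative, not the project's actual definitions.

```rust
// Illustrative sketch of the overlap check behind the adjacency rules:
// two NxN samples are compatible in a direction when their overlapping
// N x (N-1) regions hold identical pixels.
#[derive(Clone, Copy)]
pub enum Direction { Up, Down, Left, Right }

pub struct Sample {
    n: usize,
    pixels: Vec<u32>, // row-major, n * n entries
}

impl Sample {
    fn at(&self, x: usize, y: usize) -> u32 {
        self.pixels[y * self.n + x]
    }

    /// True when `other` may sit next to `self` in direction `dir`.
    pub fn compatible(&self, other: &Sample, dir: Direction) -> bool {
        let n = self.n;
        match dir {
            // `other` to the right: self's columns 1..n == other's columns 0..n-1
            Direction::Right => {
                (0..n).all(|y| (1..n).all(|x| self.at(x, y) == other.at(x - 1, y)))
            }
            // `other` below: self's rows 1..n == other's rows 0..n-1
            Direction::Down => {
                (1..n).all(|y| (0..n).all(|x| self.at(x, y) == other.at(x, y - 1)))
            }
            // The remaining directions are the mirrored checks
            Direction::Left => other.compatible(self, Direction::Right),
            Direction::Up => other.compatible(self, Direction::Down),
        }
    }
}
```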

WFC Core Overview

The core algorithm starts by creating a grid of cells, with each cell in a superposition state, meaning it can be any of the NxN patterns. This concept is similar to how a Sudoku puzzle starts, with each blank cell initially holding all possible numbers from 1-9. As you observe the neighbors in the same row, column, and box, the range of possibilities for each cell gets narrowed down. Eventually, you reach a point where only one number can fit in a particular cell, leading to convergence. Similarly, in the Wave Function Collapse algorithm, observing the neighbors' patterns gradually collapses the superposition state until a final configuration is achieved.
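The starting grid described above could be represented like this; the `Cell` type and its layout are illustrative assumptions, not the project's actual structures.

```rust
// Illustrative sketch: every cell begins in full superposition,
// i.e. every pattern index is still possible.
pub struct Cell {
    pub possible: Vec<bool>, // possible[p] is true while pattern p is still allowed
}

pub fn new_grid(width: usize, height: usize, num_patterns: usize) -> Vec<Cell> {
    (0..width * height)
        .map(|_| Cell { possible: vec![true; num_patterns] })
        .collect()
}
```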

The Wave Function Collapse algorithm operates on a simple "search and kill" loop. Here is a high-level pseudo-code representation of the main loop:

while remaining_uncollapsed_cells > 0 {
	// Choose the next cell with the lowest
	// entropy which hasn't been collapsed yet
	next_coord = find_cell_with_lowest_entropy()

	// Collapse the chosen cell
	collapse_cell_at(next_coord)

	// Propagate the effects
	propagate()

	remaining_uncollapsed_cells -= 1
}

Entropy

Entropy measures uncertainty, and it's essential in the Wave Function Collapse algorithm. It refers to the number of possibilities that remain in a particular cell (frequency calculations are involved but omitted here). The higher the entropy, the more possibilities there are, and the greater the uncertainty.

Knowing about entropy helps us choose which cell to collapse next for faster convergence. Collapsing cells with low entropy first reduces the chance of producing an impossible output, which happens when a cell is left with no valid patterns at all. Such contradictions make the output invalid, so we need to be careful when selecting cells to collapse.
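The frequency calculations omitted above commonly take the form of Shannon entropy over the remaining patterns, weighted by how often each pattern appeared in the input. The sketch below assumes that formulation; the names are illustrative.

```rust
// Weighted ("Shannon") entropy of one cell: patterns that appeared more
// often in the input contribute more weight.
// H = ln(sum w) - (sum w*ln w) / (sum w), over the remaining patterns.
pub fn entropy(possible: &[bool], freq: &[f64]) -> f64 {
    let mut sum_w = 0.0;
    let mut sum_w_log_w = 0.0;
    for (p, &alive) in possible.iter().enumerate() {
        if alive {
            sum_w += freq[p];
            sum_w_log_w += freq[p] * freq[p].ln();
        }
    }
    sum_w.ln() - sum_w_log_w / sum_w
}
```

Two equally likely patterns give ln 2; a single remaining pattern gives zero, i.e. no uncertainty left.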

Cell Collapse

Once a cell is selected for observation and collapse, we choose one of its remaining compatible patterns at random, weighted by the frequency hints. This ensures that the output of the Wave Function Collapse algorithm statistically reflects the characteristics of the input.
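The weighted choice can be sketched as follows. This is an illustration, not the project's code: `rand_unit` stands in for any RNG value in [0, 1), and the names are assumptions.

```rust
// Frequency-weighted collapse: pick one remaining pattern with
// probability proportional to its input frequency.
pub fn pick_pattern(possible: &[bool], freq: &[f64], rand_unit: f64) -> usize {
    let total: f64 = possible
        .iter()
        .zip(freq)
        .filter(|(&alive, _)| alive)
        .map(|(_, &w)| w)
        .sum();
    let mut target = rand_unit * total;
    let mut last = 0;
    for (p, (&alive, &w)) in possible.iter().zip(freq).enumerate() {
        if alive {
            last = p;
            if target < w {
                return p; // landed inside this pattern's slice of the total
            }
            target -= w;
        }
    }
    last // guards against floating-point edge cases near rand_unit == 1
}
```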

Propagation

The propagation phase is where the algorithm's complexity begins. During the propagation phase, the algorithm updates neighboring cells based on the removal of a specific possibility from the origin point. It's like a 2D domino effect, where changes propagate like a wave throughout the grid. In other words, the removal of a possibility in one cell affects its neighboring cells and triggers further updates.

The chaos is cranked to 11 because when a neighboring cell is updated and some patterns are no longer possible, it enqueues a new removal update, triggering another round of propagation. This process can escalate quickly, causing a chain reaction of updates and removals that can be difficult to predict.
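The wave of updates can be sketched with a worklist. To keep the example self-contained it runs on a one-dimensional row of cells with a single "may sit to the right of" relation; a real grid repeats the same logic per direction, and production implementations usually maintain support counts instead of rescanning. Names are illustrative.

```rust
use std::collections::VecDeque;

// Worklist propagation on a 1-D row of cells. When a cell's
// possibilities shrink, its neighbour is rechecked; any pattern there
// that has lost all support is removed, and the neighbour is enqueued
// in turn: the 2D domino effect in miniature.
pub fn propagate(
    cells: &mut Vec<Vec<bool>>, // cells[c][p]: pattern p still possible in cell c
    allowed: &[Vec<bool>],      // allowed[a][b]: b may appear to the right of a
    start: usize,               // cell whose possibilities just changed
) {
    let num_patterns = allowed.len();
    let mut queue = VecDeque::from([start]);
    while let Some(c) = queue.pop_front() {
        let n = c + 1; // only the right-hand neighbour in this sketch
        if n >= cells.len() {
            continue;
        }
        let mut changed = false;
        for p in 0..num_patterns {
            if cells[n][p] {
                // p survives only if some pattern to its left still supports it
                let supported = (0..num_patterns).any(|q| cells[c][q] && allowed[q][p]);
                if !supported {
                    cells[n][p] = false;
                    changed = true;
                }
            }
        }
        if changed {
            queue.push_back(n); // the wave rolls on
        }
    }
}
```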

Parallelization

Initially, we had planned to parallelize the propagation process, but it proved too complicated. Instead, we decided to parallelize the collapse of subsections of the output. Our approach is straightforward: we forcefully collapse N-1 cells in the middle column and row of the grid, triggering propagation updates across the entire grid. With those cells fixed, each subsection of the grid can be isolated and processed independently. This way, we can handle each subsection in parallel without worrying about seams in the output.

One surprising advantage of this strategy is that it lowers the chance of contradiction. We only discovered this after the implementation, but it makes perfect sense in hindsight: each subsection is much smaller than the full grid, so it carries less overall entropy.
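The fan-out over subsections might look like the following. This is only a structural sketch: `solve_quadrant` is a placeholder for running the full find-lowest-entropy / collapse / propagate loop on one subsection, and the names are assumptions.

```rust
use std::thread;

// With the middle row and column pre-collapsed, the four quadrants are
// independent jobs that can run on their own threads.
fn solve_quadrant(id: usize) -> Vec<usize> {
    // ...run the collapse loop on quadrant `id`...
    vec![id] // placeholder result
}

pub fn solve_all_quadrants() -> Vec<Vec<usize>> {
    thread::scope(|s| {
        let handles: Vec<_> = (0..4)
            .map(|q| s.spawn(move || solve_quadrant(q)))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).collect()
    })
}
```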

Contradiction, contradiction, contradiction!

Contradiction is a common issue with the Wave Function Collapse algorithm and can be challenging to remedy. There are various techniques to recover from a contradiction, but they are beyond the scope of our project.

As this project is for a parallel computing class, we opted for the simple remedy of retrying in parallel. For each grid subsection, we assign four workers, each computing its own candidate answer; we then select one of the successful results at random.
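The retry scheme can be sketched like this. `try_solve` is a placeholder standing in for the full collapse loop, returning `None` on contradiction; here, even-numbered seeds "succeed" purely so the selection logic has something to pick from, and the sketch keeps the first success rather than a random one.

```rust
use std::thread;

// Four workers attempt the same subsection with different seeds; any
// worker that hits a contradiction returns None, and one surviving
// result is kept.
fn try_solve(seed: u64) -> Option<Vec<u8>> {
    // ...run the collapse loop, bailing out with None on contradiction...
    if seed % 2 == 0 { Some(vec![seed as u8]) } else { None } // placeholder
}

pub fn solve_with_retries() -> Option<Vec<u8>> {
    thread::scope(|s| {
        let handles: Vec<_> = (0..4)
            .map(|seed| s.spawn(move || try_solve(seed)))
            .collect();
        // Keep any successful result (first here; a random pick works too)
        handles.into_iter().filter_map(|h| h.join().unwrap()).next()
    })
}
```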

Extra

Here is how N can affect the output.

(Example outputs for N = 2, 3, 5, and 6; images not shown.)