
🎄 Advent of Code 2023

Solver for Advent of Code 2023 puzzles.

This is a single command-line application for all puzzles, written in Free Pascal with Lazarus 2.2.6 and compiled with FPC 3.2.2.

Puzzle Input

This project does not contain the puzzle or example inputs, as per the copyright notice of Advent of Code. In order to run the compiled application, the puzzle inputs have to be downloaded from the Advent of Code 2023 puzzle pages and placed as text files in the bin\data directory, e.g. bin\data\cube_conundrum.txt or bin\data\example\cube_conundrum.txt. The application outputs a detailed error message if it cannot find an input file.

Tests

On day 3, I introduced unit tests to help troubleshoot issues and prevent regressions while I kept iterating on the solver class framework. These tests cover the provided example solutions, plus occasional partial-data tests wherever I felt the need for them.

I also added tests for the full puzzle data once I had found the solution, but these tests are not public, so as not to spoil the Advent of Code puzzles.

My Favorites

While I think all the puzzles were a lot of fun, some of them became my favorites, marked with a star, because I enjoyed the puzzle itself or was particularly content with my solution algorithm. These are day 9, day 18, and day 24.

Solutions

Day 1: Trebuchet?!

🔎 Puzzle: https://adventofcode.com/2023/day/1, Solver: UTrebuchet.pas

My solution parses each line once forward for the right number, and once backward for the left number for both parts of the puzzle.

Day 2: Cube Conundrum

🔎 Puzzle: https://adventofcode.com/2023/day/2, Solver: UCubeConundrum.pas

That one seemed pretty straightforward. For each line, the solution immediately sums up games that fulfill the maxima and finds the maximum of each color.

Day 3: Gear Ratios

🔎 Puzzle: https://adventofcode.com/2023/day/3, Solver: UGearRatios.pas

This was the first puzzle where I had to implement a solver processing multiple lines at once instead of one by one. Here, it also needs the lines directly before and after the one being processed.

The algorithm processes the numbers in the middle line and looks for additional symbols in the lines before and after. The tricky part was to correctly track the data needed for processing each line and to discard it in time, without resorting to reading in all the data before processing.
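
A minimal sketch of such a three-line window in Free Pascal, reading from standard input (the names and the placeholder processing routine are made up; this is not the solver's actual code):

```pascal
program GearRatiosSketch;
{$mode objfpc}{$H+}
// Minimal sketch of the three-line window described above: only the previous,
// current and next line are kept in memory while the input is streamed.
uses
  SysUtils;

procedure ProcessMiddleLine(const Above, Middle, Below: string);
begin
  // This is where numbers in Middle would be matched against symbols in
  // Above, Middle and Below; here we only show the window contents.
  WriteLn('processing "', Middle, '" between "', Above, '" and "', Below, '"');
end;

var
  Above, Middle, Below: string;
begin
  Above := '';
  Middle := '';
  while not Eof do
  begin
    ReadLn(Below);
    if Middle <> '' then
      ProcessMiddleLine(Above, Middle, Below);
    Above := Middle;
    Middle := Below;
  end;
  if Middle <> '' then
    ProcessMiddleLine(Above, Middle, ''); // an empty string stands for "no line"
end.
```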

Day 4: Scratchcards

🔎 Puzzle: https://adventofcode.com/2023/day/4, Solver: UScratchCards.pas

For part 1, the algorithm simply matches winning numbers against numbers we have, and multiplies the current line result by two for every match (except the first).

For part 2, there is a list of numbers of card copies for the upcoming cards, where the list index is shifted to be always relative to the current line. This works because the copies are always applied contiguously over upcoming cards. Once a card has been processed, its copy value is deleted from the beginning of the list.
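
A minimal sketch of both parts in Free Pascal, assuming the match counts per card have already been parsed (the array below holds the six match counts of the puzzle example; all names are made up and this is not the solver's actual code):

```pascal
program ScratchCardsSketch;
{$mode objfpc}{$H+}
// Minimal sketch of the scoring and the shifted list of pending card copies.
uses
  SysUtils;
const
  MatchesPerCard: array[0..5] of Integer = (4, 2, 2, 1, 0, 0);
var
  PendingCopies: array of Int64; // copies owed to upcoming cards, index 0 = current card
  Score, TotalCards, Copies: Int64;
  i, j: Integer;
begin
  Score := 0;
  TotalCards := 0;
  for i := 0 to High(MatchesPerCard) do
  begin
    // Part 1: the first match scores 1, every further match doubles the score.
    if MatchesPerCard[i] > 0 then
      Score := Score + Int64(1) shl (MatchesPerCard[i] - 1);

    // Part 2: copies of the current card = 1 original + whatever earlier cards awarded.
    Copies := 1;
    if Length(PendingCopies) > 0 then
    begin
      Copies := Copies + PendingCopies[0];
      // The current card has been processed, drop its entry from the front.
      for j := 1 to High(PendingCopies) do
        PendingCopies[j - 1] := PendingCopies[j];
      SetLength(PendingCopies, Length(PendingCopies) - 1);
    end;
    TotalCards := TotalCards + Copies;

    // Award copies to the next MatchesPerCard[i] cards.
    for j := 0 to MatchesPerCard[i] - 1 do
    begin
      if j >= Length(PendingCopies) then
        SetLength(PendingCopies, j + 1); // new entries are zero-initialized
      PendingCopies[j] := PendingCopies[j] + Copies;
    end;
  end;
  WriteLn('Part 1: ', Score, ', part 2: ', TotalCards); // expected 13 and 30
end.
```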

Day 5: If You Give A Seed A Fertilizer

🔎 Puzzle: https://adventofcode.com/2023/day/5, Solver: UGiveSeedFertilizer.pas

Originally, I had implemented this by reading all data in first, constructing a list of the seven mappings, each containing a list of mapping ranges. I rewrote this when I realized that the conversion can be done line-by-line by maintaining separate lists of "unconverted" and "converted" values. Each mapping range is applied to all unconverted values, and if one matches it is converted and moved into the list of converted values. At the end of a map all converted values are moved back into the unconverted list. Unconverted values simply remain unconverted for the next map.

For part 2, it is not necessary (and not feasible) to convert the input ranges into individual values to run through the existing algorithm. Instead, I modified the algorithm to run on ranges of input directly. This means that a successful conversion can split a range into up to three parts, where one is moved into the "converted" pile, while the others remain unconverted.
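
A minimal sketch of that range splitting in Free Pascal, with ranges given as start value and length (the names are made up; this is not the solver's actual code):

```pascal
program GiveSeedFertilizerSketch;
{$mode objfpc}{$H+}
// Minimal sketch: applying one mapping line to a range can split it into a
// converted part and up to two unconverted left-over parts.
uses
  SysUtils, Math;
type
  TRange = record
    Start, Len: Int64;
  end;
  TRangeArray = array of TRange;

function MakeRange(AStart, ALen: Int64): TRange;
begin
  Result.Start := AStart;
  Result.Len := ALen;
end;

procedure Append(var A: TRangeArray; const R: TRange);
begin
  SetLength(A, Length(A) + 1);
  A[High(A)] := R;
end;

// Applies the mapping line "DestStart SrcStart MapLen" to Input. The overlap
// is appended to Converted (shifted to the destination), the rest to Unconverted.
procedure ApplyMappingRange(const Input: TRange; DestStart, SrcStart, MapLen: Int64;
  var Converted, Unconverted: TRangeArray);
var
  InputEnd, SrcEnd, OverlapStart, OverlapEnd: Int64;
begin
  InputEnd := Input.Start + Input.Len; // exclusive
  SrcEnd := SrcStart + MapLen;         // exclusive
  OverlapStart := Max(Input.Start, SrcStart);
  OverlapEnd := Min(InputEnd, SrcEnd);
  if OverlapStart >= OverlapEnd then
  begin
    Append(Unconverted, Input); // no overlap, the range stays as it is
    Exit;
  end;
  // Converted middle part, shifted from the source range to the destination range.
  Append(Converted, MakeRange(DestStart + (OverlapStart - SrcStart), OverlapEnd - OverlapStart));
  // Unconverted pieces left and right of the overlap.
  if Input.Start < OverlapStart then
    Append(Unconverted, MakeRange(Input.Start, OverlapStart - Input.Start));
  if OverlapEnd < InputEnd then
    Append(Unconverted, MakeRange(OverlapEnd, InputEnd - OverlapEnd));
end;

var
  Converted, Unconverted: TRangeArray;
  i: Integer;
begin
  // Seed range "79 14" against the mapping line "52 50 48" from the puzzle example.
  ApplyMappingRange(MakeRange(79, 14), 52, 50, 48, Converted, Unconverted);
  for i := 0 to High(Converted) do
    WriteLn('converted:   ', Converted[i].Start, ' (length ', Converted[i].Len, ')');
  for i := 0 to High(Unconverted) do
    WriteLn('unconverted: ', Unconverted[i].Start, ' (length ', Unconverted[i].Len, ')');
end.
```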

Day 6: Wait For It

🔎 Puzzle: https://adventofcode.com/2023/day/6, Solver: UWaitForIt.pas

This one I solved by calculating the roots of the function f(x) = -x^2 + time * x - distance, where x is the time the button is held, and determining the distance between them. Part 2 was the first puzzle where my solver required 64-bit integers for the calculations.
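
A minimal sketch of this calculation in Free Pascal, using floating-point roots and checked against the three example races (the function name is made up; this is not the solver's actual code):

```pascal
program WaitForItSketch;
{$mode objfpc}{$H+}
// Minimal sketch: holding the button for x ms gives a distance of x * (Time - x),
// so the winning hold times are the integers strictly between the roots of
// f(x) = -x^2 + Time * x - Distance.
uses
  SysUtils, Math;

function CountWaysToWin(Time, Distance: Int64): Int64;
var
  Discriminant, Root: Double;
  First, Last: Int64;
begin
  Discriminant := Time * Time - 4.0 * Distance;
  if Discriminant <= 0 then
    Exit(0);
  Root := Sqrt(Discriminant);
  // Smallest and largest integer hold times that strictly beat the record.
  First := Floor((Time - Root) / 2.0) + 1;
  Last := Ceil((Time + Root) / 2.0) - 1;
  Result := Max(Int64(0), Last - First + 1);
end;

begin
  // Example races from the puzzle page: expected 4, 8 and 9 ways.
  WriteLn(CountWaysToWin(7, 9), ' ', CountWaysToWin(15, 40), ' ', CountWaysToWin(30, 200));
end.
```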

Day 7: Camel Cards

🔎 Puzzle: https://adventofcode.com/2023/day/7, Solver: UCamelCards.pas

The first puzzle that I could not solve line-by-line (day 6 doesn't count). For this one I store all the card hands and assign them a "type", e.g. "four of a kind", when processing them by counting the different card values in a hand. The rest of the work is done in a custom compare function. When all data is processed, I just use the compare function to sort all card hands, and then multiply the resulting indices by the bids.

For part 2, each card hand gets a "joker type" analogous to the "type", for which the number of joker cards is added to the highest count of a different card value.
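
A minimal sketch of the hand classification in Free Pascal, including the joker adjustment (the enum and function names are made up; this is not the solver's actual code):

```pascal
program CamelCardsSketch;
{$mode objfpc}{$H+}
// Minimal sketch: a hand is classified by counting equal card labels; for the
// joker variant, the jokers are added to the largest group of a different label.
uses
  SysUtils, TypInfo;
type
  THandType = (htHighCard, htOnePair, htTwoPair, htThreeOfAKind, htFullHouse,
    htFourOfAKind, htFiveOfAKind);

function ClassifyHand(const Hand: string; JokersWild: Boolean): THandType;
var
  Counts: array[Char] of Integer;
  C: Char;
  Jokers, Best, Second: Integer;
begin
  FillChar(Counts, SizeOf(Counts), 0);
  for C in Hand do
    Inc(Counts[C]);
  Jokers := 0;
  if JokersWild then
  begin
    Jokers := Counts['J'];
    Counts['J'] := 0;
  end;
  // Find the two largest groups of equal labels.
  Best := 0;
  Second := 0;
  for C := Low(Char) to High(Char) do
    if Counts[C] > Best then
    begin
      Second := Best;
      Best := Counts[C];
    end
    else if Counts[C] > Second then
      Second := Counts[C];
  Best := Best + Jokers; // jokers always join the largest group
  case Best of
    5: Result := htFiveOfAKind;
    4: Result := htFourOfAKind;
    3: if Second = 2 then Result := htFullHouse else Result := htThreeOfAKind;
    2: if Second = 2 then Result := htTwoPair else Result := htOnePair;
  else
    Result := htHighCard;
  end;
end;

begin
  // "T55J5" from the puzzle example: three of a kind normally, four of a kind with jokers.
  WriteLn(GetEnumName(TypeInfo(THandType), Ord(ClassifyHand('T55J5', False))), ' ',
    GetEnumName(TypeInfo(THandType), Ord(ClassifyHand('T55J5', True))));
end.
```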

Day 8: Haunted Wasteland

🔎 Puzzle: https://adventofcode.com/2023/day/8, Solver: UHauntedWasteland.pas

Again a puzzle where I had to read in all of the data before starting the algorithm. It proved difficult to verify parts of the algorithm by hand, but part 1 was still pretty straightforward.

Part 2 was a bit sneaky. This is the first puzzle where the result is outside the 32-bit unsigned integer range. And it is solvable only because each starting node leads into a loop with one of the target nodes, where the length of the loop is a multiple of the length of the sequence of instructions. With this knowledge, one can stop traversing the network once each target node has been reached and calculate the result directly.
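
The shortcut boils down to a least common multiple of the individual loop lengths; a minimal sketch in Free Pascal, using the two loop lengths from the part 2 example (not the solver's actual code):

```pascal
program HauntedWastelandSketch;
{$mode objfpc}{$H+}
// Minimal sketch: once the loop length of every starting node is known,
// the answer is the least common multiple of those lengths.
uses
  SysUtils;

function Gcd(A, B: Int64): Int64;
var
  T: Int64;
begin
  while B <> 0 do
  begin
    T := A mod B;
    A := B;
    B := T;
  end;
  Result := A;
end;

function Lcm(A, B: Int64): Int64;
begin
  Result := (A div Gcd(A, B)) * B;
end;

const
  // Loop lengths of the two ghosts in the part 2 example.
  StepsPerStart: array[0..1] of Int64 = (2, 3);
var
  Answer: Int64;
  i: Integer;
begin
  Answer := 1;
  for i := Low(StepsPerStart) to High(StepsPerStart) do
    Answer := Lcm(Answer, StepsPerStart[i]);
  WriteLn(Answer); // 6, as in the part 2 example
end.
```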

Day 9: Mirage Maintenance

🔎 Puzzle: https://adventofcode.com/2023/day/9, Solver: UMirageMaintenance.pas

This one I enjoyed the most so far. The process described in the puzzle, constructing a series of differences from the previous series and then reverting the process to extend the series, is equivalent to finding a polynomial of degree at most n - 1, where the original series consists of n equidistant values of that polynomial.

So instead of using the outlined "brute force" method, I used Lagrange polynomials with x1 = 0, x2 = 1, ..., xn = n - 1 evaluated at x = n (for part 1) and x = -1 (for part 2) to find the function values for the extrapolated "points". Conveniently, the Lagrange polynomials can be precalculated for the whole puzzle (with some tricks to not run over Int64 limits) because they only depend on x values, which remain constant. This makes the calculation of the extrapolated values quite easy.

A nice explanation of the Lagrange method can be found here http://bueler.github.io/M310F11/polybasics.pdf.
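
A minimal sketch of the extrapolation in Free Pascal, evaluating the Lagrange form directly and ignoring the Int64 overflow concerns mentioned above (not the solver's actual code):

```pascal
program MirageMaintenanceSketch;
{$mode objfpc}{$H+}
// Minimal sketch: evaluate the interpolation polynomial through the points
// (0, Y[0]), (1, Y[1]), ..., (n-1, Y[n-1]) at an arbitrary integer position X.
uses
  SysUtils;

function LagrangeExtrapolate(const Y: array of Int64; X: Int64): Int64;
var
  i, j: Integer;
  Numerator, Denominator: Int64;
begin
  Result := 0;
  for i := 0 to High(Y) do
  begin
    Numerator := 1;
    Denominator := 1;
    for j := 0 to High(Y) do
      if j <> i then
      begin
        Numerator := Numerator * (X - j);
        Denominator := Denominator * (i - j);
      end;
    // The division is exact because the nodes are consecutive integers.
    Result := Result + Y[i] * (Numerator div Denominator);
  end;
end;

const
  Example: array[0..5] of Int64 = (1, 3, 6, 10, 15, 21);
begin
  // Second series of the puzzle example: expected 28 (part 1) and 0 (part 2).
  WriteLn(LagrangeExtrapolate(Example, Length(Example)), ' ',
    LagrangeExtrapolate(Example, -1));
end.
```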

Day 10: Pipe Maze

🔎 Puzzle: https://adventofcode.com/2023/day/10, Solver: UPipeMaze.pas

The input data is such that there are only two pipes pointing to S, so finding the loop is only a matter of following the chars as instructed. It seems best to read in the full input before trying to traverse the maze; I did not see another option. The length of the loop is always even, so my algorithm just follows the path until it is back at S and counts only every other step.

For part 2, I tracked tiles that are "left" and "right" of the path the algorithm took, and implemented a little flood-fill algorithm that tries to fill the area in between the "left" tiles and the "right" tiles, while counting them. The "outside" group is the one where the flood-fill touches the edge of the map and is simply ignored.

Day 11: Cosmic Expansion

🔎 Puzzle: https://adventofcode.com/2023/day/11, Solver: UCosmicExpansion.pas

While parsing the input, the solver tracks the coordinates of each galaxy and, for each row and column, a 1 if it is empty and a 0 if not. At the end, for each pair of galaxies, we sum the values of each row and column between their coordinates, plus 1 each, to get the sum of their (Manhattan) distances.

This approach was trivial to adapt for part 2, since all that was needed was another factor by which the values tracked for the rows and columns are multiplied before applying the +1.
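
A minimal sketch of the distance calculation for both parts in Free Pascal, assuming the per-row and per-column emptiness flags have already been collected (the names are made up; this is not the solver's actual code):

```pascal
program CosmicExpansionSketch;
{$mode objfpc}{$H+}
// Minimal sketch: every empty row or column between two galaxies counts as
// ExpansionFactor steps, every other one as a single step.
uses
  SysUtils, Math;

// RowEmpty[i] / ColEmpty[i] are 1 for empty rows/columns and 0 otherwise.
function GalaxyDistance(Row1, Col1, Row2, Col2: Integer;
  const RowEmpty, ColEmpty: array of Integer; ExpansionFactor: Int64): Int64;
var
  i: Integer;
begin
  Result := 0;
  for i := Min(Row1, Row2) + 1 to Max(Row1, Row2) do
    Result := Result + RowEmpty[i] * (ExpansionFactor - 1) + 1;
  for i := Min(Col1, Col2) + 1 to Max(Col1, Col2) do
    Result := Result + ColEmpty[i] * (ExpansionFactor - 1) + 1;
end;

const
  // Empty-row/column flags of the puzzle example: rows 3 and 7 and
  // columns 2, 5 and 8 are empty.
  RowEmpty: array[0..9] of Integer = (0, 0, 0, 1, 0, 0, 0, 1, 0, 0);
  ColEmpty: array[0..9] of Integer = (0, 0, 1, 0, 0, 1, 0, 0, 1, 0);
begin
  // Galaxies 1 (row 0, column 3) and 7 (row 8, column 7): expected distance 15.
  WriteLn(GalaxyDistance(0, 3, 8, 7, RowEmpty, ColEmpty, 2));
end.
```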

Day 13: Point of Incidence

🔎 Puzzle: https://adventofcode.com/2023/day/13, Solver: UPointOfIncidence.pas

While going through each line, the algorithm keeps updating two lists of mirror candidates, one for horizontal and one for vertical mirrors. For horizontal mirrors, a new candidate is added whenever two consecutive lines are identical. After it is added, each new line is compared against the potentially mirrored earlier line, until the candidate is discarded or the last line is successfully mirrored.

For vertical mirrors, all candidates are added during processing of the first line, based on whether they mirror the first line or not. While processing further lines, each candidate is verified against each line or discarded.

To solve part 2, each candidate is allowed one character switch, and tracks whether that switch has happened or not to successfully mirror all processed lines. If a second character switch would be required, or no switch has occurred by the end, the candidate is discarded. By setting this tracker as if a switch had already happened even for new candidates, both parts of the puzzle can be solved simultaneously.

Day 14: Parabolic Reflector Dish

🔎 Puzzle: https://adventofcode.com/2023/day/14, Solver: UParabolicReflectorDish.pas

I spent too much time on this one. I had originally implemented a relatively naive algorithm that did the tilting of the platform by operating directly on the string list, swapping round rocks as it went, which seemed quite slow.

So I reimplemented the whole thing with proper data structures consisting of lists of non-empty intervals between cube-shaped rocks, and lists of rows and columns of rounded rocks, between which the algorithm would alternate, to facilitate a faster computation without string manipulation. This improved the performance of the algorithm, but unfortunately not as much as I had expected.

An essential insight for making any algorithm work here is that the formations of the rounded rocks on the platform repeat while spinning it 1,000,000,000 times, so once a previously seen formation is encountered again, the calculation can be short-cut significantly.
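
A minimal sketch of that cycle shortcut in Free Pascal, with a toy state and spin function standing in for the actual rock formation and tilting logic (the names are made up; this is not the solver's actual code):

```pascal
program ParabolicReflectorDishSketch;
{$mode objfpc}{$H+}
// Minimal sketch of the cycle shortcut. The real state would be the formation
// of rounded rocks, fingerprinted e.g. as a string; here a toy state stands in.
uses
  SysUtils, Classes;

// Toy stand-in for "one full spin cycle of the platform".
function Spin(const State: string): string;
begin
  Result := Copy(State, 2, Length(State) - 1) + State[1]; // just rotates the string
end;

function StateAfterSpins(const Start: string; Target: Int64): string;
var
  Seen: TStringList;
  State: string;
  i, CycleStart, CycleLength, Remaining: Int64;
begin
  Seen := TStringList.Create;
  try
    State := Start;
    i := 0;
    while (i < Target) and (Seen.IndexOf(State) < 0) do
    begin
      Seen.Add(State);
      State := Spin(State);
      Inc(i);
    end;
    if i < Target then
    begin
      // A formation repeated: only the remainder of the cycle matters.
      CycleStart := Seen.IndexOf(State);
      CycleLength := i - CycleStart;
      Remaining := (Target - i) mod CycleLength;
      while Remaining > 0 do
      begin
        State := Spin(State);
        Dec(Remaining);
      end;
    end;
    Result := State;
  finally
    Seen.Free;
  end;
end;

begin
  // 1,000,000,000 spins collapse to just a handful of actual spins.
  WriteLn(StateAfterSpins('OO.#.', 1000000000));
end.
```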

Day 15: Lens Library

🔎 Puzzle: https://adventofcode.com/2023/day/15, Solver: ULensLibrary.pas

A pretty straightforward implementation of a hash map with a custom hash function.
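
For reference, a minimal Free Pascal version of the hash function defined by the puzzle (the solver's own implementation may differ in details, and the function name here is made up):

```pascal
program LensLibrarySketch;
{$mode objfpc}{$H+}
// Minimal sketch of the puzzle's hash: add the character code, multiply by 17,
// and keep the remainder modulo 256.
uses
  SysUtils;

function AoCHash(const S: string): Integer;
var
  C: Char;
begin
  Result := 0;
  for C in S do
    Result := ((Result + Ord(C)) * 17) mod 256;
end;

begin
  WriteLn(AoCHash('HASH')); // 52, as stated in the puzzle description
end.
```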

Day 16: The Floor Will Be Lava

🔎 Puzzle: https://adventofcode.com/2023/day/16, Solver: UFloorWillBeLava.pas

The solver calculates how a beam traverses the grid until it is reflected outwards. Every time it hits a splitter, a new beam is put on a stack to be calculated later. The difficult part was finding a good way to track how a beam has already traveled through the grid. This is essential to detect when the calculation for a part of the beam can be aborted, since splitters can create loops; however, two beams could pass through the same tile in different ways without forming a loop. I settled for tracking four energy states for each tile of the grid: one for "not energized", two describing generically the two directions a beam could travel through some tiles, and one for the combination of those two directions. The energy state of the current tile and the beam's direction can then be used to abandon a beam early, before it leaves the boundaries of the grid.
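
A minimal sketch of the per-tile energy state check in Free Pascal (the enum and function names are made up; this is not the solver's actual code):

```pascal
program FloorWillBeLavaSketch;
{$mode objfpc}{$H+}
// Minimal sketch: a beam is abandoned as soon as its travel axis has already
// been recorded for the current tile.
uses
  SysUtils;
type
  TEnergyState = (esNone, esHorizontal, esVertical, esBoth);

// Returns False if a beam travelling along the given axis has already passed
// through this tile (the beam can be abandoned), otherwise records it.
function RecordBeam(var State: TEnergyState; Horizontal: Boolean): Boolean;
var
  Incoming: TEnergyState;
begin
  if Horizontal then
    Incoming := esHorizontal
  else
    Incoming := esVertical;
  if (State = Incoming) or (State = esBoth) then
    Exit(False);
  if State = esNone then
    State := Incoming
  else
    State := esBoth;
  Result := True;
end;

var
  Tile: TEnergyState;
begin
  Tile := esNone;
  WriteLn(RecordBeam(Tile, True));  // TRUE, first horizontal beam
  WriteLn(RecordBeam(Tile, True));  // FALSE, same axis again: abandon the beam
  WriteLn(RecordBeam(Tile, False)); // TRUE, vertical beam still allowed
end.
```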

Once this was solved for one starting beam in part 1, I just iterated over all possible starting beams to find the maximum for part 2.

Day 17: Clumsy Crucible

🔎 Puzzle: https://adventofcode.com/2023/day/17, Solver: UClumsyCrucible.pas

I initially tried to solve this with a simple depth-first search for the minimum path, abandoning branches immediately when they exceed the current lowest known path value. However, this approach is way too costly, even on the small example data. It takes too long for a branch to be abandoned, and sub-paths are recalculated many times for each of the branches.

Instead, the solver uses a somewhat Dijkstra-inspired algorithm, where for each location in the grid, it tracks two values, one for each 2D axis, which describe the lowest known accumulated heat loss when going from this grid location to the end point with the first step along the associated axis. The solver starts at the end point, where these two values are zero, and gradually calculates the minima for all grid points, while keeping track of grid locations that need recalculations. Once all grid points have been calculated, the result can be taken directly from the starting grid location.

The main modification to the classic algorithm here is that in order to calculate e.g. the horizontal current minimum for a grid point, the vertical current minima of its vertical neighbors within a certain range have to be considered. The difference between parts 1 and 2 is only the specific range to be used.

Day 18: Lavaduct Lagoon

🔎 Puzzle: https://adventofcode.com/2023/day/18, Solver: ULavaductLagoon.pas

My first algorithm for part 1 simply tracked the trench in a top-view two-dimensional array and then flood-filled the outside of the trench to determine the full area. It worked, but there were two problems. Firstly, I had to iterate over the list of digs twice in order to avoid resizing the array frequently. Secondly, the performance complexity of the algorithm depends largely on the size of the array, i.e. the length of the individual digs, so obviously it did not scale for part 2.

The final algorithm uses the fact that, locally, either all right turns are convex and all left turns concave, or the other way around. That means that two consecutive turns in the same direction (a U-turn) enclose a rectangular area that is either inside or outside of the trench, depending only on the direction of the two turns. So the algorithm simply collapses every U-turn it encounters into a straight dig instruction, thereby cutting off an area that is either added to or subtracted from the running area count.

These U-turn collapses are done immediately when adding digs, because then the U-turns will always be either at the end of the list or just before the last collapse. One difficulty is that, in order for this to work well, the algorithm needs to ensure that consecutive digs are always perpendicular, merging any parallel ones into a single dig.

Day 19: Aplenty

🔎 Puzzle: https://adventofcode.com/2023/day/19, Solver: UAplenty.pas

Since the workflows are at the beginning of the puzzle input, each machine part can be routed directly through the memorized workflows for part 1 when its line is processed. Each part starts at the "in" workflow and follows the checks and switches until it is either rejected or accepted. To improve performance, the workflows cache links to each other for each switch, each of which is set during the algorithm run after its first match.

For part two, a virtual "multi machine part" that represents all possible values of ratings, modelled as four integer intervals, is sent through the same workflow graph. Each time one of the rules is applied to a multi machine part, it is split into up to three new multi machine parts that continue through the workflows on separate paths. This is similar to my day 5 solution.

Day 22: Sand Slabs

🔎 Puzzle: https://adventofcode.com/2023/day/22, Solver: USandSlabs.pas

I first sort the bricks with a custom compare function, such that they can be stacked in this order on the ground without passing through each other. Then they can be processed one by one directly on the ground, while tracking the current height of each ground position, essentially the top view of the ground plot, and which bricks connect vertically.
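
A minimal sketch of such a compare function in Free Pascal, sorting by the lower z coordinate (not necessarily the solver's exact ordering; the record layout and names are made up):

```pascal
program SandSlabsSketch;
{$mode objfpc}{$H+}
// Minimal sketch: bricks sorted by their lower z coordinate can be dropped
// one by one in list order without passing through each other.
uses
  SysUtils, Classes;
type
  PBrick = ^TBrick;
  TBrick = record
    X1, Y1, Z1, X2, Y2, Z2: Integer; // with Z1 <= Z2
  end;

function CompareBricks(Item1, Item2: Pointer): Integer;
begin
  Result := PBrick(Item1)^.Z1 - PBrick(Item2)^.Z1;
end;

var
  Bricks: TList;
  A, B: TBrick;
begin
  A.Z1 := 5; A.Z2 := 5;
  B.Z1 := 2; B.Z2 := 4;
  Bricks := TList.Create;
  try
    Bricks.Add(@A);
    Bricks.Add(@B);
    Bricks.Sort(@CompareBricks);    // the lower brick B now comes first
    WriteLn(PBrick(Bricks[0])^.Z1); // 2
  finally
    Bricks.Free;
  end;
end.
```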

For part 1, if a brick lands on a single supporting brick, that supporting brick can no longer be disintegrated and is removed from the count if it was still counted as disintegrable.

For part 2, given a starting brick, the algorithm makes use of the tracked vertical connections to find a group of bricks supported by it, such that all supports of the bricks in the group are also in the group. This group of bricks would fall if the starting brick was disintegrated, so its size is counted for each possible starting brick.

Day 24: Never Tell Me the Odds

🔎 Puzzle: https://adventofcode.com/2023/day/24, Solver: UNeverTellMeTheOdds.pas

While I found part 1 quite trivial, part 2 left me with the feeling that my approach might be mad. Eventually, I managed to find the ray hitting all other rays by solving the general equation system for three known and one unknown rays with some shortcuts for this particular problem, for example assuming the existence of a unique solution. However, this involved excessive manual pre-calculations, arbitrary length integer arithmetic, and a root finder for integer polynomials, all implemented by myself without additional third-party libraries.

License

Copyright (C) 2023-2024 Stefan Müller

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.