Advent of Code 2023

Solver for Advent of Code 2023 puzzles.

This is a single command-line application for all puzzles, written in FreePascal with Lazarus 2.2.6 and compiled with FPC 3.2.2.

Day 1: Trebuchet?!

https://adventofcode.com/2023/day/1

My solution parses each line once forward for the right number, and once backward for the left number for both parts of the puzzle.
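
As a minimal sketch of the part 1 scan (hypothetical code, not the solver's actual unit; part 2's spelled-out digits are not handled here):

```pascal
program Day1Sketch;
{$mode objfpc}

// Part 1 only: the first and last digit of a line form a two-digit value.
// Assumes the line contains at least one digit.
function CalibrationValue(const ALine: string): Integer;
var
  i, j: Integer;
begin
  i := 1;
  while (i <= Length(ALine)) and not (ALine[i] in ['0'..'9']) do
    Inc(i);
  j := Length(ALine);
  while (j >= 1) and not (ALine[j] in ['0'..'9']) do
    Dec(j);
  Result := (Ord(ALine[i]) - Ord('0')) * 10 + Ord(ALine[j]) - Ord('0');
end;

begin
  WriteLn(CalibrationValue('pqr3stu8vwx'));  // 38
end.
```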

Day 2: Cube Conundrum

https://adventofcode.com/2023/day/2

That one seemed pretty straightforward. For each line, the solution immediately adds the game to the sum if all of its draws stay within the given maxima (part 1) and tracks the maximum count drawn of each color (part 2).
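
The two per-draw operations can be sketched like this (names and the demo draws are made up):

```pascal
program Day2Sketch;
{$mode objfpc}
uses Math;

const
  CMaxRed = 12; CMaxGreen = 13; CMaxBlue = 14;

var
  MaxRed, MaxGreen, MaxBlue: Integer;  // maxima seen in the current game (part 2)
  Possible: Boolean;                   // game fits the limits (part 1)

// Processes one draw of a game: checks the part 1 limits and updates the part 2 maxima.
procedure ProcessDraw(const ARed, AGreen, ABlue: Integer);
begin
  if (ARed > CMaxRed) or (AGreen > CMaxGreen) or (ABlue > CMaxBlue) then
    Possible := False;
  MaxRed := Max(MaxRed, ARed);
  MaxGreen := Max(MaxGreen, AGreen);
  MaxBlue := Max(MaxBlue, ABlue);
end;

begin
  Possible := True;
  MaxRed := 0; MaxGreen := 0; MaxBlue := 0;
  ProcessDraw(3, 0, 4);  // e.g. "3 red, 4 blue"
  ProcessDraw(6, 2, 1);
  WriteLn(Possible);                     // part 1: does the game fit the limits?
  WriteLn(MaxRed * MaxGreen * MaxBlue);  // part 2: "power" of the game
end.
```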

Day 3: Gear Ratios

https://adventofcode.com/2023/day/3

For this I modified the solver class to pass in three lines at once, shifting one line down in each iteration, processing the numbers in the middle line and looking for additional symbols in the lines before and after. The tricky part was to correctly track the data needed to process each line and to discard it in time, without resorting to reading all data in before processing.
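
The line-window mechanism looks roughly like this (hypothetical file name and processing hook, not the actual solver class):

```pascal
program Day3WindowSketch;
{$mode objfpc}

var
  InputFile: TextFile;
  Previous, Current, Next: string;

// Hypothetical hook: process the numbers in ACurrent, checking APrevious
// and ANext for adjacent symbols.
procedure ProcessLines(const APrevious, ACurrent, ANext: string);
begin
  WriteLn('[', APrevious, '] [', ACurrent, '] [', ANext, ']');
end;

begin
  AssignFile(InputFile, 'input03.txt');  // hypothetical file name
  Reset(InputFile);
  Previous := '';
  Current := '';
  while not Eof(InputFile) do
  begin
    ReadLn(InputFile, Next);
    if Current <> '' then
      ProcessLines(Previous, Current, Next);
    // Shift the window down by one line; the oldest line is discarded.
    Previous := Current;
    Current := Next;
  end;
  ProcessLines(Previous, Current, '');  // the last line has no line below it
  CloseFile(InputFile);
end.
```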

I introduced the test framework for this puzzle while stumbling over quite a few bugs.

Day 4: Scratchcards

https://adventofcode.com/2023/day/4

For part 1, the algorithm simply matches winning numbers against numbers we have, and multiplies the current line result by two for every match (except the first).

For part 2 there is a list of numbers of card copies for the upcoming cards, where the list index is always relative to the current line. This works because the copies are always applied contiguously over upcoming cards. Once a card has been processed, its copy value is deleted from the beginning of the list (index 0).
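
A sketch of that bookkeeping, with made-up match counts per card (the generic list type here stands in for whatever the solver actually uses):

```pascal
program Day4CopiesSketch;
{$mode objfpc}
uses fgl;

type
  TIntList = specialize TFPGList<Integer>;

const
  // Hypothetical numbers of matching numbers per card.
  Matches: array[0..5] of Integer = (4, 2, 2, 1, 0, 0);

var
  Copies: TIntList;  // pending extra copies; index 0 belongs to the current card
  i, j, Count, Total: Integer;

begin
  Copies := TIntList.Create;
  Total := 0;
  for i := 0 to High(Matches) do
  begin
    // Instances of this card: the original plus any pending copies.
    Count := 1;
    if Copies.Count > 0 then
    begin
      Count := Count + Copies[0];
      Copies.Delete(0);  // this card is processed, drop its entry
    end;
    Total := Total + Count;
    // Every instance wins one copy of each of the next Matches[i] cards.
    for j := 0 to Matches[i] - 1 do
      if j < Copies.Count then
        Copies[j] := Copies[j] + Count
      else
        Copies.Add(Count);
  end;
  Copies.Free;
  WriteLn(Total);  // 30 for the match counts above
end.
```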

Day 5: If You Give A Seed A Fertilizer

https://adventofcode.com/2023/day/5

Originally, I had implemented this by reading all data in first, constructing a list of the seven mappings, each containing a list of mapping ranges. I rewrote this when I realized that the conversion can be done line-by-line by maintaining separate lists of "unconverted" and "converted" values. Each mapping range is applied to all unconverted values, and if one matches it is converted and moved into the list of converted values. At the end of a map all converted values are moved back into the unconverted list. Unconverted values simply remain unconverted for the next map.

For part 2, it is not necessary (and not feasible) to convert the input ranges into individual values to run through the existing algorithm. Instead I modified the algorithm to run on ranges of input directly. This means that a successful conversion can split a range into up to three parts, where one is moved into the "converted" pile, while the others remain unconverted.
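
The splitting step can be sketched as follows; the types and names are illustrative, and empty pieces are simply signalled by `First > Last`:

```pascal
program Day5RangeSplitSketch;
{$mode objfpc}
uses Math;

type
  TRange = record
    First, Last: Int64;  // inclusive bounds
  end;

function MakeRange(const AFirst, ALast: Int64): TRange;
begin
  Result.First := AFirst;
  Result.Last := ALast;
end;

// Applies one mapping range (source ASrcFirst..ASrcLast, shifted by AOffset)
// to AInput. The overlap is converted; the pieces before and after the
// source range remain unconverted.
procedure ApplyMapping(const AInput: TRange;
  const ASrcFirst, ASrcLast, AOffset: Int64;
  out AConverted, ABefore, AAfter: TRange);
begin
  ABefore := MakeRange(AInput.First, Min(AInput.Last, ASrcFirst - 1));
  AAfter := MakeRange(Max(AInput.First, ASrcLast + 1), AInput.Last);
  AConverted := MakeRange(Max(AInput.First, ASrcFirst) + AOffset,
    Min(AInput.Last, ASrcLast) + AOffset);
end;

var
  Converted, Before, After: TRange;
begin
  // Seed range 79..92 against the map line "52 50 48" (source 50..97, offset +2).
  ApplyMapping(MakeRange(79, 92), 50, 97, 2, Converted, Before, After);
  WriteLn(Converted.First, '..', Converted.Last);  // 81..94
end.
```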

Day 6: Wait For It

https://adventofcode.com/2023/day/6

This one I solved by calculating the roots of the function `f(x) = -x^2 + time * x - distance` and determining the distance between them. Part 2 was the first puzzle that required 64-bit integers for the calculations.
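
A sketch of the root-based counting; the small epsilons just push exact-integer roots off the boundary, because the record has to be strictly beaten:

```pascal
program Day6RootsSketch;
{$mode objfpc}
uses Math;

// Number of integer hold times x with x * (ATime - x) > ADistance,
// i.e. the integers strictly between the two roots of -x^2 + ATime*x - ADistance.
function WinningWays(const ATime, ADistance: Int64): Int64;
var
  D, Lower, Upper: Double;
begin
  D := Sqrt(ATime * ATime - 4.0 * ADistance);
  Lower := (ATime - D) / 2.0;
  Upper := (ATime + D) / 2.0;
  Result := Floor(Upper - 1e-9) - Ceil(Lower + 1e-9) + 1;
end;

begin
  WriteLn(WinningWays(7, 9));     // 4
  WriteLn(WinningWays(30, 200));  // 9
end.
```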

Day 7: Camel Cards

https://adventofcode.com/2023/day/7

The first puzzle that I could not solve line-by-line (day 6 doesn't count). For this one I store all the card hands and assign them a "type", e.g. "four of a kind", when processing them by counting the different card values in a hand. The rest of the work is done in a custom compare function. When all data is processed I just use the compare function to sort all card hands, and then multiply the resulting indices by the bids.
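
Determining the type by counting card labels can be sketched like this (part 1 only, hypothetical names, and a hand is assumed to be exactly five characters):

```pascal
program Day7TypeSketch;
{$mode objfpc}

type
  // From weakest to strongest hand type.
  THandType = (htHighCard, htOnePair, htTwoPair, htThreeOfAKind,
    htFullHouse, htFourOfAKind, htFiveOfAKind);

// Determines the hand type by counting how often each card label occurs.
function GetHandType(const AHand: string): THandType;
var
  i, j, Pairs, MaxCount: Integer;
  Counts: array[1..5] of Integer;
begin
  for i := 1 to 5 do
  begin
    Counts[i] := 0;
    for j := 1 to 5 do
      if AHand[j] = AHand[i] then
        Inc(Counts[i]);
  end;
  MaxCount := 0;
  Pairs := 0;
  for i := 1 to 5 do
  begin
    if Counts[i] > MaxCount then MaxCount := Counts[i];
    if Counts[i] = 2 then Inc(Pairs);
  end;
  case MaxCount of
    5: Result := htFiveOfAKind;
    4: Result := htFourOfAKind;
    3: if Pairs > 0 then Result := htFullHouse else Result := htThreeOfAKind;
    2: if Pairs = 4 then Result := htTwoPair else Result := htOnePair;
  else
    Result := htHighCard;
  end;
end;

begin
  WriteLn(GetHandType('T55J5') = htThreeOfAKind);  // TRUE
end.
```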

For part 2, each card hand gets a "joker type" analogous to the "type", for which the number of joker cards is added to the highest number of a different card type.

Day 8: Haunted Wasteland

https://adventofcode.com/2023/day/8

Again a puzzle where I had to read in all of the data before starting the algorithm. It proved difficult to verify parts of the algorithm by hand, but part 1 was still pretty straightforward.

Part 2 was a bit sneaky. This is the first puzzle where the result is outside the 32-bit unsigned integer range. And it is solvable only because each starting node leads into a loop with one of the target nodes, where the length of the loop is a multiple of the length of the sequence of instructions. With this knowledge, one can stop traversing the network once each target node has been reached and calculate the result directly.
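
The final step is then a least common multiple over the loop lengths, as in this sketch (the loop lengths below are made up; the repository keeps its own GCD/LCM routines in UNumberTheory.pas):

```pascal
program Day8LcmSketch;
{$mode objfpc}

// Greatest common divisor (Euclidean algorithm).
function Gcd(const A, B: Int64): Int64;
begin
  if B = 0 then
    Result := A
  else
    Result := Gcd(B, A mod B);
end;

// Least common multiple; dividing first avoids overflowing Int64.
function Lcm(const A, B: Int64): Int64;
begin
  Result := (A div Gcd(A, B)) * B;
end;

const
  // Hypothetical loop lengths, one per starting node.
  StepsPerGhost: array[0..2] of Int64 = (20093, 12169, 22357);

var
  i: Integer;
  Answer: Int64;
begin
  Answer := 1;
  for i := 0 to High(StepsPerGhost) do
    Answer := Lcm(Answer, StepsPerGhost[i]);
  WriteLn(Answer);
end.
```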

Day 9: Mirage Maintenance

https://adventofcode.com/2023/day/9

This one I enjoyed the most so far. The process that is described in the puzzle, constructing a series of differences from the previous series, and then reverting the process to extend the series, is equivalent to finding a polynomial of degree at most n - 1, where the original series consists of n equidistant values of the polynomial.

So instead of using the outlined "brute force" method, I used Lagrange polynomials with x1 = 0, x2 = 1, ..., xn = n - 1 evaluated at x = n (for part 1) and x = -1 (for part 2) to find the function values for the extrapolated "points". Conveniently, the Lagrange polynomials can be precalculated for the whole puzzle (with some tricks to avoid exceeding the Int64 limits) because they only depend on the x values, which remain constant. This makes the calculation of the extrapolated values quite easy.

A nice explanation of the Lagrange method can be found here http://bueler.github.io/M310F11/polybasics.pdf.
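
A floating-point sketch of the Lagrange evaluation (the actual solver precalculates the weights with integer arithmetic, which this sketch skips):

```pascal
program Day9LagrangeSketch;
{$mode objfpc}

// Lagrange basis polynomial L_i evaluated at AX, with sample points 0..ACount-1.
function LagrangeWeight(const AIndex, ACount: Integer; const AX: Double): Double;
var
  j: Integer;
begin
  Result := 1.0;
  for j := 0 to ACount - 1 do
    if j <> AIndex then
      Result := Result * (AX - j) / (AIndex - j);
end;

// Value of the interpolating polynomial at AX, i.e. the extrapolated entry.
function Extrapolate(const AValues: array of Int64; const AX: Double): Int64;
var
  i: Integer;
  Sum: Double;
begin
  Sum := 0.0;
  for i := 0 to High(AValues) do
    Sum := Sum + AValues[i] * LagrangeWeight(i, Length(AValues), AX);
  Result := Round(Sum);
end;

const
  Example: array[0..5] of Int64 = (10, 13, 16, 21, 30, 45);
begin
  WriteLn(Extrapolate(Example, Length(Example)));  // 68, part 1
  WriteLn(Extrapolate(Example, -1.0));             // 5, part 2
end.
```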

Day 10: Pipe Maze

https://adventofcode.com/2023/day/10

The input data is such that there are only two pipes pointing to S, so finding the loop is only a matter of following the characters as instructed. It seems best to read in the full input before trying to traverse the maze; I did not see another option. The length of the loop is always even, so my algorithm just follows the path until it is back at S and counts only every other step.

For part 2, I tracked tiles that are "left" and "right" of the path the algorithm took, and implemented a little flood-fill algorithm that tries to fill the area in between the "left" tiles and the "right" tiles, while counting them. The "outside" group is the one where the flood-fill touches the edge of the map and is simply ignored.
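
A generic flood fill with edge detection can be sketched like this; the actual solver seeds it from the tracked "left"/"right" tiles instead of a fixed start point:

```pascal
program Day10FloodFillSketch;
{$mode objfpc}

var
  Map: array of string;  // '.' = unknown tile, '#' = loop tile
  Count: Integer;
  TouchesEdge: Boolean;

// Fills the region of '.' tiles connected to (AX, AY), counting its size
// and noting whether it reaches the edge of the map (then it is "outside").
procedure Fill(const AX, AY: Integer);
begin
  if (AY < 0) or (AY > High(Map)) or (AX < 1) or (AX > Length(Map[AY])) then
  begin
    TouchesEdge := True;
    Exit;
  end;
  if Map[AY][AX] <> '.' then
    Exit;
  Map[AY][AX] := 'o';  // mark as visited
  Inc(Count);
  Fill(AX + 1, AY);
  Fill(AX - 1, AY);
  Fill(AX, AY + 1);
  Fill(AX, AY - 1);
end;

begin
  SetLength(Map, 3);
  Map[0] := '#####';
  Map[1] := '#...#';
  Map[2] := '#####';
  Count := 0;
  TouchesEdge := False;
  Fill(2, 1);  // start next to the loop; x is 1-based, y is 0-based
  WriteLn(Count, ' ', TouchesEdge);  // 3 FALSE -> an enclosed region of 3 tiles
end.
```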

Day 11: Cosmic Expansion

https://adventofcode.com/2023/day/11

While parsing the input, we track the coordinates of each galaxy and, for each row and column, a 1 if it is empty and a 0 if not. At the end, for each pair of galaxies, we sum the tracked values of all rows and columns between their coordinates, +1 for each, to get the sum of their (Manhattan) distances.

This approach was trivial to adapt for part 2, since all that was needed was an additional factor by which the tracked row and column values are multiplied before applying the +1.
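
A sketch of the per-axis sum; here the tracked value is multiplied by `ExpansionFactor - 1` so that an empty row or column counts as exactly `ExpansionFactor` rows, which may differ slightly from the solver's own bookkeeping:

```pascal
program Day11DistanceSketch;
{$mode objfpc}

const
  // 1 marks an empty row/column, 0 a non-empty one (hypothetical values).
  EmptyRow: array[0..4] of Integer = (0, 1, 0, 0, 1);
  EmptyCol: array[0..4] of Integer = (0, 0, 1, 0, 0);
  ExpansionFactor = 1000000;  // 2 for part 1, 1000000 for part 2

// Distance along one axis between coordinates A and B: every row/column
// crossed counts 1, empty ones count ExpansionFactor instead.
function AxisDistance(const AEmpty: array of Integer; const A, B: Integer): Int64;
var
  i, Lo, Hi: Integer;
begin
  if A < B then begin Lo := A; Hi := B; end
  else begin Lo := B; Hi := A; end;
  Result := 0;
  for i := Lo to Hi - 1 do
    Result := Result + AEmpty[i] * (ExpansionFactor - 1) + 1;
end;

begin
  // Manhattan distance between galaxies at (row 0, col 1) and (row 3, col 4).
  WriteLn(AxisDistance(EmptyRow, 0, 3) + AxisDistance(EmptyCol, 1, 4));  // 2000004
end.
```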

License

Copyright (C) 2023 Stefan Müller

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.