jzimbel's recent activity
-
Comment on Day 4: Ceres Search in ~comp.advent_of_code
-
Comment on Day 3: Mull It Over in ~comp.advent_of_code
jzimbel (edited)
Elixir
I decided to solve this one with regular expressions even though I'll probably regret it if future puzzles keep iterating on this concept.
I learned about the (?|...) capture group for this one. Shout out to my pal regex101.com.

Elixir's regex implementation has this nice x modifier that makes it ignore all whitespace within the pattern, as well as anything after "#" on a line. This lets you break the pattern up into separate lines and add inline comments. Great for readability!

Both parts
defmodule AdventOfCode.Solution.Year2024.Day03 do
  use AdventOfCode.Solution.SharedParse

  @impl true
  def parse(input), do: String.trim(input)

  def part1(mem) do
    ~r/mul\((\d{1,3}),(\d{1,3})\)/
    |> Regex.scan(mem, capture: :all_but_first)
    |> sum_of_products()
  end

  def part2(mem) do
    # (?| ...) drops unmatched alternatives from the returned captures.
    # E.g. matching "don't()" will produce ["don't()"] instead of ["", "", "", "don't()"]
    ~r"""
    (?|
      (mul)\((\d{1,3}),(\d{1,3})\)
      |(do)\(\)
      |(don't)\(\)
    )
    """x
    |> Regex.scan(mem, capture: :all_but_first)
    |> apply_toggles()
    |> sum_of_products()
  end

  defp sum_of_products(factor_pairs) do
    factor_pairs
    |> Enum.map(fn [a, b] -> String.to_integer(a) * String.to_integer(b) end)
    |> Enum.sum()
  end

  defp apply_toggles(matches) do
    Enum.reduce(matches, %{on?: true, acc: []}, fn
      ["mul" | factor_pair], state when state.on? ->
        update_in(state.acc, &[factor_pair | &1])

      ["mul" | _factor_pair], state when not state.on? ->
        state

      ["do"], state ->
        %{state | on?: true}

      ["don't"], state ->
        %{state | on?: false}
    end).acc
  end
end
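In case the branch reset group is new to anyone else, here's a minimal iex sketch (separate from the solution above) showing what it changes about the captures:

iex> Regex.scan(~r/(do)\(\)|(don't)\(\)/, "do()don't()", capture: :all_but_first)
[["do"], ["", "don't"]]
iex> Regex.scan(~r/(?|(do)\(\)|(don't)\(\))/, "do()don't()", capture: :all_but_first)
[["do"], ["don't"]]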
-
Comment on Day 2: Red-Nosed Reports in ~comp.advent_of_code
jzimbel
Elixir
I thought for a while about what a "smart" approach for part 2 would look like, but then I tried out the brute-force approach since it was a one-liner, and it ran in a few hundred microseconds so... seems good!
Both parts
In safe_p1?, the Stream.scan call lazily checks each pair of numbers, emitting :not_safe if any pair is invalid. (The other values it emits aren't important as long as they are something other than :not_safe.)

The Enum.all? call stops and returns false as soon as it finds a :not_safe in the stream.

This combo is (IMO) a bit cleaner and easier to understand than a single Enum.reduce_while that achieves the same "stateful short-circuiting check" logic.
I use a boolean accumulator, asc?, to track whether the numbers are ascending (true) or descending (false). It starts out as nil and gets set after the first interval is checked.

defmodule AdventOfCode.Solution.Year2024.Day02 do
  use AdventOfCode.Solution.SharedParse

  def parse(input) do
    for line <- String.split(input, "\n", trim: true) do
      line |> String.split() |> Enum.map(&String.to_integer/1)
    end
  end

  def part1(reports), do: Enum.count(reports, &safe_p1?/1)
  def part2(reports), do: Enum.count(reports, &safe_p2?/1)

  defp safe_p1?(ns) do
    ns
    |> Enum.chunk_every(2, 1, :discard)
    |> Stream.scan(nil, &check_interval/2)
    |> Enum.all?(&(&1 != :not_safe))
  end

  defp check_interval([a, b], asc?) do
    with true <- gradual?(a, b),
         {true, asc?} <- asc_desc?(a, b, asc?) do
      asc?
    else
      false -> :not_safe
    end
  end

  defp gradual?(a, b), do: abs(b - a) in 1..3

  defp asc_desc?(a, b, nil), do: {true, b - a > 0}
  defp asc_desc?(a, b, asc?) when b - a > 0 == asc?, do: {true, asc?}
  defp asc_desc?(_a, _b, _asc?), do: false

  # Brute-force runs in a few hundred µs so I guess it's fine!
  defp safe_p2?(ns) do
    safe_p1?(ns) or
      Enum.any?(
        0..(length(ns) - 1),
        &safe_p1?(List.delete_at(ns, &1))
      )
  end
end
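To illustrate the lazy short-circuiting with a made-up toy pipeline (not part of the solution): Enum.all? stops pulling from the stream as soon as it sees a :not_safe, so the rest of the range is never scanned.

iex> 1..1_000_000
...> |> Stream.scan(:ok, fn n, _acc -> if n < 4, do: :ok, else: :not_safe end)
...> |> Enum.all?(&(&1 != :not_safe))
false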
-
Comment on Day 1: Historian Hysteria in ~comp.advent_of_code
jzimbel
Using Elixir again this year.
After using it for 4 consecutive years, I've built up a nice project with various conveniences and a few modules for the data structures commonly used by AoC puzzles.
I'm thinking about learning to use dataframes with Explorer (Elixir's answer to Python's pandas), but for now I just used the standard library.

Parts 1 and 2
(The SharedParse thing signals to my solution runner that both part1 and part2 can reuse the same value parsed from the input.)

defmodule AdventOfCode.Solution.Year2024.Day01 do
  use AdventOfCode.Solution.SharedParse

  @impl AdventOfCode.Solution.SharedParse
  def parse(input) do
    input
    |> String.split()
    |> Enum.map(&String.to_integer/1)
    |> then(
      &[
        _col_a = Enum.take_every(&1, 2),
        _col_b = Enum.take_every(tl(&1), 2)
      ]
    )
  end

  def part1(cols) do
    cols
    |> Enum.map(&Enum.sort/1)
    |> Enum.zip()
    |> Enum.map(fn {a, b} -> abs(a - b) end)
    |> Enum.sum()
  end

  def part2([col_a, col_b]) do
    a_freq = Enum.frequencies(col_a)
    b_freq = Enum.frequencies(col_b)

    a_freq
    |> Enum.map(fn {n, freq} -> n * freq * Map.get(b_freq, n, 0) end)
    |> Enum.sum()
  end
end
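(Quick sketch of what the Enum.take_every calls in parse do, with made-up numbers: they split the alternating values into the two columns.)

iex> nums = [3, 4, 2, 1, 3, 3]
iex> {Enum.take_every(nums, 2), Enum.take_every(tl(nums), 2)}
{[3, 2, 3], [4, 1, 3]}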
-
Comment on Annual gift of a Christmas tree from the people of Oslo in Norway to London continues – tree was cut down early yesterday morning in woods in the northern part of Norway's capital in ~life
-
Comment on Megalopolis | Official trailer in ~movies
jzimbel
YouTube says the video is private now. Just “This video is private” on a black background.
What happened?
Edit - It’s still available on other channels. Mirror
-
Comment on Where are you on the spectrum of vacation planning? Detailed to the hour or floating like a leaf in the wind? in ~travel
jzimbel
Funny you mention Alcatraz, because my partner and I took this approach for a visit to SF and had an absolutely wonderful time—we booked a table at a nice restaurant and made plans with relatives in the area, and otherwise just chose a different city park to explore each day. It was super cheap and super fun to get around with the municipal electric bikeshare and public transit.
-
Comment on Tildes Video Thread in ~misc
-
Comment on Day 22: Sand Slabs in ~comp.advent_of_code
jzimbel
Ah gotcha, I had missed that the first new brick (the lower of the two) caused one of the seven bricks from the original example to become load-bearing when it previously wasn't. So the new count of 5 was composed of 4 from the existing set of 7, and 1 from the new pair.
And yes, that fixed it!
-
Comment on Day 22: Sand Slabs in ~comp.advent_of_code
jzimbel
Sorry if this is resurfacing bad debugging memories, but how does that even work? Both of the two new bricks are load-bearing? If 1,2,11~1,4,11 starts above all the others, how can it be load-bearing?

(I think I'm missing the same edge case as you were, thanks for sharing the extra example!)
-
Comment on Is there a programming language that brings you joy? in ~comp
jzimbel
It’s elixir for me—while the syntax can look a bit odd at first, you eventually realize it’s composed of a small number of basic elements with several layers of optional syntactic sugar on top. At its core, syntactically, it’s similar to a lisp. The language’s creator has actually described it as a lisp-2 language once or twice. This allows for robust and relatively user-friendly macros.
Separately from the syntax, the BEAM runtime’s focus on actor model/shared-nothing concurrency means you can (and are encouraged to) create applications that behave like a network of fault-tolerant mini-services rather than a single monolithic program. The way you structure your logic is just very different from other languages, and in a good way. You also don’t need to reach for additional technologies as often—for example, it’s trivial to set up a cache without memcached or the like, or cronjob-like tasks.
Having used elixir as the backend language at my job for the past 3 years, I can truthfully say that tracking down and fixing bugs in our elixir code has never taken more than a couple hours. The immutable functional style within each elixir process, and the supervisor tree / actor model between processes, generally makes for a very resilient and observable system.
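To give a rough idea of what I mean about not needing extra infrastructure, here's a minimal, hypothetical sketch (made-up module name, not code from my job) of an in-memory cache that refreshes itself on a timer using nothing but a GenServer:

defmodule MyApp.Cache do
  # Hypothetical example: a tiny in-memory cache with a periodic refresh,
  # built from plain OTP pieces. No memcached, no external cron.
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  def get(key), do: GenServer.call(__MODULE__, {:get, key})

  @impl true
  def init(_opts) do
    schedule_refresh()
    {:ok, %{}}
  end

  @impl true
  def handle_call({:get, key}, _from, state), do: {:reply, Map.get(state, key), state}

  @impl true
  def handle_info(:refresh, _state) do
    schedule_refresh()
    {:noreply, load_data()}
  end

  defp schedule_refresh, do: Process.send_after(self(), :refresh, :timer.minutes(5))

  # Fetch fresh data from wherever it lives; returns the new cache state.
  defp load_data, do: %{}
end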
-
Comment on ‘Winning requires hard work’: Wayfair CEO sends employees a gloomy pre-holiday email following layoff-filled year in ~finance
jzimbel
Also worked at Wayfair, 2017-2020, as a SWE, and I also experienced a strong culture of “What are you still doing here at 5(:30), it’s not that serious, go home right now”. Upper management definitely leaned a bit sociopath, but in general most of the teams I worked on had decent managers who cared about their people’s wellbeing.
Niraj sent out “story time” emails like this weekly, I believe, most with a theme of “work hard, work smart”.
Not to defend the company—it was and is kind of a mess—but we all learned to basically tune out these emails from Niraj and co.
Try asking around about the “WayThrifty” one. lol.
-
Comment on Day 14: Reflector Dish in ~comp.advent_of_code
jzimbel
Elixir
For some reason I have a lot of trouble wrapping my head around the math/logic to extrapolate a repeating sequence of values, not starting at index 0, to the nth element. This time, after working it out again, I actually bothered to write a little function to do it. No more tears on these problems in the future. 🥲
New Sequence module

The function itself is short, but I wanted to make sure I'd have an easy time remembering how to use it when the next iteration of this "find the bazillionth iteration of some sequence" problem comes up again.
defmodule AdventOfCode.Sequence do
  @moduledoc """
  Functions for infinitely repeating sequences.
  """

  @doc """
  Given an indexed list describing the repeating segment of an infinitely
  repeating sequence, returns the value at `at_index` of the sequence.

  Indices in the list do not need to start at 0, but they do need to be
  contiguous, and the list must be ordered by index, ascending.

  `at_index` can be any integer.
  """
  @spec at(nonempty_list({a, index :: integer}), integer) :: a when a: term
  def at(repeat, i) do
    [{_value, start_index} | _] = repeat
    {value, _index} = Enum.at(repeat, Integer.mod(i - start_index, length(repeat)))
    value
  end
end
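Example usage, with a made-up repeating segment whose indices start at 5:

iex> repeat = [{:a, 5}, {:b, 6}, {:c, 7}]
iex> AdventOfCode.Sequence.at(repeat, 11)
:a
iex> AdventOfCode.Sequence.at(repeat, 1_000_000_000)
:c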
Part 1 and part 2
I find it very intuitive to model this kind of "cycle of repeated operations" problem as a stream pipeline.
defmodule AdventOfCode.Solution.Year2023.Day14 do
  alias AdventOfCode.Grid, as: G
  alias AdventOfCode.Sequence

  def part1(input) do
    input
    |> G.from_input()
    |> tilt(:n)
    |> rock_load()
  end

  def part2(input) do
    [:n, :w, :s, :e]
    |> Stream.cycle()
    |> Stream.scan(G.from_input(input), &tilt(&2, &1))
    |> Stream.drop(3)
    |> Stream.take_every(4)
    |> Stream.with_index(1)
    |> Enum.reduce_while(%{}, fn {grid, i}, indices ->
      case Map.fetch(indices, grid) do
        {:ok, repeat_start} -> {:halt, build_repeat_segment(indices, repeat_start, i - 1)}
        :error -> {:cont, Map.put(indices, grid, i)}
      end
    end)
    |> Sequence.at(1_000_000_000)
  end

  defp build_repeat_segment(indices, start_index, end_index) do
    repeat_indices = start_index..end_index//1

    indices
    |> Stream.filter(fn {_g, j} -> j in repeat_indices end)
    |> Stream.map(fn {g, j} -> {rock_load(g), j} end)
    |> Enum.sort_by(&elem(&1, 1))
  end

  defp tilt(grid, :n), do: do_tilt(grid, &G.cols/1, ?O, &G.from_cols/1)
  defp tilt(grid, :s), do: do_tilt(grid, &G.cols/1, ?., &G.from_cols/1)
  defp tilt(grid, :w), do: do_tilt(grid, &G.rows/1, ?O, &G.from_rows/1)
  defp tilt(grid, :e), do: do_tilt(grid, &G.rows/1, ?., &G.from_rows/1)

  defp do_tilt(grid, to_lanes, char_to_send_forward, from_lanes) do
    grid
    |> to_lanes.()
    |> Enum.map(fn lane ->
      lane
      |> Stream.map(fn {_coords, char} -> char end)
      |> Stream.chunk_by(&(&1 == ?#))
      |> Enum.flat_map(fn
        [?# | _] = l -> l
        l -> char_to_front(l, char_to_send_forward)
      end)
    end)
    |> from_lanes.()
  end

  defp char_to_front(l, char, acc \\ [])
  defp char_to_front([], _char, acc), do: acc
  defp char_to_front([char | rest], char, acc), do: [char | char_to_front(rest, char, acc)]
  defp char_to_front([other | rest], char, acc), do: char_to_front(rest, char, [other | acc])

  defp rock_load(grid) do
    grid
    |> G.rows()
    |> Stream.with_index()
    |> Stream.map(fn {row, i} ->
      (grid.height - i) * Enum.count(row, &match?({_, ?O}, &1))
    end)
    |> Enum.sum()
  end
end
-
Comment on Day 15: Lens Library in ~comp.advent_of_code
jzimbel
It seemed a little weird to me as well for day 15, especially with how much they spelled out the instructions compared to past days from this year. Kind of a basic data structures problem.
-
Comment on Day 15: Lens Library in ~comp.advent_of_code
jzimbel
Elixir
Very easy!
List.keystore/4 and List.keydelete/3 were perfect for the map-building operations.

Parts 1 and 2
Squeezed down to 24 mostly-illegible lines, because I felt like it
defmodule AdventOfCode.Solution.Year2023.Day15 do
  def part1(input), do: input |> parse() |> Stream.map(&hash/1) |> Enum.sum()
  def part2(input), do: input |> parse() |> to_map() |> stream_focusing_powers() |> Enum.sum()

  defp to_map(ops) do
    ops
    |> Stream.map(&Regex.run(~r/([^=]+)(?:-|=(.+))/, &1, capture: :all_but_first))
    |> Enum.reduce(Map.new(0..255, &{&1, []}), fn
      [i, f], m -> Map.update!(m, hash(i), &List.keystore(&1, i, 0, {i, String.to_integer(f)}))
      [i], m -> Map.update!(m, hash(i), &List.keydelete(&1, i, 0))
    end)
  end

  defp stream_focusing_powers(map) do
    Stream.flat_map(map, fn {box, lenses} ->
      lenses
      |> Stream.with_index(1)
      |> Stream.map(fn {{_label, f}, slot} -> (1 + box) * slot * f end)
    end)
  end

  defp hash(str), do: for(<<char <- str>>, reduce: 0, do: (acc -> rem((acc + char) * 17, 256)))

  defp parse(input), do: input |> String.split(",", trim: true) |> Stream.map(&String.trim/1)
end
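(A quick sketch of those two calls, using a couple of made-up lens labels in one box:)

iex> box = [{"rn", 1}, {"cm", 2}]
iex> box = List.keystore(box, "rn", 0, {"rn", 4})
[{"rn", 4}, {"cm", 2}]
iex> List.keydelete(box, "cm", 0)
[{"rn", 4}]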
-
Comment on Day 16: The Floor Will Be Lava in ~comp.advent_of_code
jzimbel
Elixir
My code is pretty incomprehensible because I was messing around with macros to define functions with "unusual" names 🌝
The macro tomfoolery essentially allows the solution to turn the grid's tiles into direct function calls, no map lookups or anonymous functions necessary. It didn't really improve performance, but I had fun seeing what kind of nonsense I could get into.
Parts 1 and 2
defmodule AdventOfCode.Solution.Year2023.Day16 do
  defmodule Silliness do
    defmacro __using__(_) do
      quote do
        import Kernel, only: :macros

        def unquote(:.)(h), do: [h]

        def unquote(:/)({x, y}), do: [{-y, -x}]
        def unquote(:"\\")({x, y}), do: [{y, x}]

        def unquote(:-)({0, _}), do: [{-1, 0}, {1, 0}]
        def unquote(:-)(h), do: [h]

        def unquote(:|)({_, 0}), do: [{0, -1}, {0, 1}]
        def unquote(:|)(h), do: [h]
      end
    end
  end

  defmodule TurnOps, do: use(Silliness)

  alias AdventOfCode.Grid, as: G

  defstruct [:beams, history: MapSet.new()]

  def part1(input) do
    input
    |> G.from_input(&String.to_existing_atom(<<&1>>))
    |> count_energized({{0, 0}, {1, 0}})
  end

  def part2(input) do
    grid = G.from_input(input, &String.to_existing_atom(<<&1>>))

    grid
    |> stream_init_beam_states()
    |> Task.async_stream(&count_energized(grid, &1), ordered: false)
    |> Stream.map(fn {:ok, count} -> count end)
    |> Enum.max()
  end

  defp count_energized(grid, init_beam_state) do
    %__MODULE__{beams: [init_beam_state]}
    |> Stream.iterate(&step_beams(&1, grid))
    |> Enum.find(&match?(%{beams: []}, &1))
    |> then(& &1.history)
    |> MapSet.new(fn {coords, _heading} -> coords end)
    |> MapSet.size()
  end

  defp step_beams(%{beams: beams, history: history}, grid) do
    %__MODULE__{
      beams: Enum.flat_map(beams, &if(&1 in history, do: [], else: step_beam(&1, grid))),
      history: MapSet.union(history, MapSet.new(beams))
    }
  end

  defp step_beam({coords, heading}, %{width: w, height: h} = grid) do
    apply(TurnOps, G.at(grid, coords), [heading])
    |> Stream.map(fn new_heading -> {sum_coords(coords, new_heading), new_heading} end)
    |> Enum.filter(fn {{x, y}, _} -> x in 0..(w - 1)//1 and y in 0..(h - 1)//1 end)
  end

  defp stream_init_beam_states(%{width: w, height: h}) do
    [
      Stream.flat_map(1..(h - 2)//1, fn y -> [{{0, y}, {1, 0}}, {{w - 1, y}, {-1, 0}}] end),
      Stream.flat_map(1..(w - 2)//1, fn x -> [{{x, 0}, {0, 1}}, {{x, h - 1}, {0, -1}}] end),
      Stream.flat_map([{0, 0}, {w - 1, 0}, {0, h - 1}, {w - 1, h - 1}], fn {x, y} ->
        x_headings = if x == 0, do: [0, 1], else: [0, -1]
        y_headings = if y == 0, do: [1, 0], else: [-1, 0]
        for heading <- Enum.zip(x_headings, y_headings), do: {{x, y}, heading}
      end)
    ]
    |> Stream.concat()
  end

  defp sum_coords({x1, y1}, {x2, y2}), do: {x1 + x2, y1 + y2}
end
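If the operator-named functions look opaque: each tile character becomes an atom, and applying that atom to TurnOps maps the incoming heading to the outgoing heading(s). A rough sketch of what those calls should return:

iex> apply(TurnOps, :., [{1, 0}])   # empty tile: keep heading east
[{1, 0}]
iex> apply(TurnOps, :|, [{1, 0}])   # vertical splitter hit from the side: split up and down
[{0, -1}, {0, 1}]
iex> apply(TurnOps, :/, [{1, 0}])   # mirror: heading east turns to heading north (up)
[{0, -1}]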
-
Comment on Day 12: Hot Spring in ~comp.advent_of_code
jzimbel (edited)
Welp, my updated solution for part 2 is passing unit tests (including checking the count for each line individually) but giving too high an answer for the real input. It seems like I'm missing some edge case that exists only in the real input, but I couldn't begin to guess what that is.
Could someone who has a working solution share their puzzle input, with each individual line tagged with its individual arrangement count? Only if you have time to help a stranger on the internet, ofc.
Maybe via pastebin / GH gist, considering how big the input is...

edit: I figured it out! I wasn't checking that the arrangement contains no '#'s after the last contiguous segment.
Parts 1 and 2, now working
After changing it to run on all lines concurrently, and tuning the cache table appropriately, it solves part 2 in about 1.8 sec on my machine. :)
defmodule AdventOfCode.Solution.Year2023.Day12 do
  @table __MODULE__.MemoTable

  def part1(input) do
    input
    |> String.split("\n", trim: true)
    |> solve()
  end

  def part2(input) do
    input
    |> String.split("\n", trim: true)
    |> Stream.map(&quintuple/1)
    |> solve()
  end

  defp solve(lines) do
    :ets.new(@table, [
      :named_table,
      :public,
      read_concurrency: true,
      write_concurrency: true,
      decentralized_counters: true
    ])

    lines
    |> Stream.map(&parse_line/1)
    |> Task.async_stream(
      fn {pattern, contig_counts} -> memo_count_arrangements(pattern, contig_counts, 0) end,
      ordered: false
    )
    |> Stream.map(fn {:ok, count} -> count end)
    |> Enum.sum()
  after
    :ets.delete(@table)
  end

  defp quintuple(line) do
    [springs_str, sizes_str] = String.split(line)

    [{springs_str, "?"}, {sizes_str, ","}]
    |> Enum.map(fn {s, j} -> List.duplicate(s, 5) |> Enum.join(j) end)
    |> Enum.join(" ")
  end

  defp memo_count_arrangements(pattern, contig_counts, offset) do
    memo({pattern, contig_counts, offset}, fn ->
      count_arrangements(pattern, contig_counts, offset)
    end)
  end

  defp count_arrangements(pattern, [contig_count | _], offset)
       when byte_size(pattern) - offset < contig_count,
       do: 0

  defp count_arrangements(pattern, [], offset) do
    # If there are any '#'s in the remainder of the string, this arrangement is not valid.
    case :binary.match(pattern, "#", scope: {byte_size(pattern), offset - byte_size(pattern)}) do
      :nomatch -> 1
      _ -> 0
    end
  end

  defp count_arrangements(pattern, [contig_count | rest_contig], offset) do
    trailing_dot? = match?([_ | _], rest_contig)
    segment_size = contig_count + if(trailing_dot?, do: 1, else: 0)

    # The first '#' we find limits how far we can go.
    # The last possible fit for this contig_count starts at that '#'.
    stop_after =
      case :binary.match(pattern, "#", scope: {byte_size(pattern), offset - byte_size(pattern)}) do
        {pos, _len} -> pos
        :nomatch -> :infinity
      end

    dot = if trailing_dot?, do: "[?.]", else: ""

    ~r"""
    [?\#]  # This is the character that we'll get the index of. The first character of this valid contiguous section.
    (?=    # Positive lookahead. Check that the correct characters follow the start of this contiguous section, without consuming them.
      [?\#]{#{contig_count - 1}}  # Remaining characters of the contiguous section.
      #{dot}                      # Optional trailing '.', if this is not the final contiguous section.
    )
    """x
    |> Regex.scan(pattern, offset: offset, return: :index)
    |> Stream.flat_map(fn [{pos, _len}] -> if pos <= stop_after, do: [pos], else: [] end)
    |> Stream.map(&memo_count_arrangements(pattern, rest_contig, &1 + segment_size))
    |> Enum.sum()
  end

  defp memo(key, fun) do
    case :ets.match(@table, {key, :"$1"}) do
      [[value]] -> value
      [] -> tap(fun.(), &:ets.insert(@table, {key, &1}))
    end
  end

  defp parse_line(line) do
    [springs_str, sizes_str] = String.split(line)
    {springs_str, Enum.map(String.split(sizes_str, ","), &String.to_integer/1)}
  end
end
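As a sanity check, the last line of the example from the puzzle description should give 10:

iex> AdventOfCode.Solution.Year2023.Day12.part1("?###???????? 3,2,1")
10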
-
Comment on Day 11: Cosmic Expansion in ~comp.advent_of_code
jzimbel
Elixir
This one was pretty straightforward. One of the more esoteric Enum functions, flat_map_reduce/3, came in handy here.

Parts 1 and 2
I had some extra time to tweak things and see how they affected performance, so parts of the code are a bit more complicated or weird than they need to be, but hopefully it still mostly makes sense.
To avoid having to deal with a sparsely-filled grid after expanding along one axis, I did the expansions separately (and concurrently), with each producing a map looking like original_galaxy_coordinates => new_x_or_y_coordinate. Then, I merged the two maps together to get the full new coordinates of each galaxy.

defmodule AdventOfCode.Solution.Year2023.Day11 do
  alias AdventOfCode.CharGrid, as: G

  def part1(input), do: solve(input, 1)
  def part2(input), do: solve(input, 999_999)

  def solve(input, spacing) do
    input
    |> G.from_input()
    |> expanded_galaxy_coords(spacing)
    |> pairwise_distances_sum()
  end

  defp expanded_galaxy_coords(grid, spacing) do
    [new_xs, new_ys] =
      Task.await_many([
        Task.async(fn -> expanded_axis(grid, spacing, &G.cols/1, fn {x, _y} -> x end) end),
        Task.async(fn -> expanded_axis(grid, spacing, &G.rows/1, fn {_x, y} -> y end) end)
      ])

    Enum.map(new_xs, fn {coords, x} -> {x, new_ys[coords]} end)
  end

  defp expanded_axis(grid, spacing, lanes_fn, axis_fn) do
    grid
    |> lanes_fn.()
    |> Enum.flat_map_reduce(0, fn lane, expand_by ->
      if Enum.all?(lane, &match?({_coords, ?.}, &1)) do
        {[], expand_by + spacing}
      else
        {for({coords, ?#} <- lane, do: {coords, axis_fn.(coords) + expand_by}), expand_by}
      end
    end)
    |> then(fn {new_axis, _expand_by} -> Map.new(new_axis) end)
  end

  defp pairwise_distances_sum(galaxy_coords, sum \\ 0)

  defp pairwise_distances_sum([], sum), do: sum

  defp pairwise_distances_sum([{x1, y1} | rest], sum) do
    new_sums =
      for({x2, y2} <- rest, reduce: 0, do: (acc -> acc + abs(x2 - x1) + abs(y2 - y1)))

    pairwise_distances_sum(rest, sum + new_sums)
  end
end
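(For anyone unfamiliar, flat_map_reduce/3 emits zero or more elements per input while threading an accumulator along. A made-up toy example:)

iex> Enum.flat_map_reduce([1, 2, 3], 0, fn x, acc -> {[x, x * 10], acc + x} end)
{[1, 10, 2, 20, 3, 30], 6}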
-
Comment on Day 9: Mirage Maintenance in ~comp.advent_of_code
jzimbel
Elixir
Not much to say about this one, pretty straightforward. My only mistake was accidentally rerunning my part 1 solution and submitting that as my part 2 answer after seeing my tests pass.
Parts 1 and 2
I did a tiny bit of optimization by reversing the lists of numbers depending on which end I was interested in, so that I could do hd(list) (constant time) rather than List.last(list) (linear time). Probably could have avoided a bunch of near-duplicate code, or the non-tail-call recursion for part 2, but meh this works.

defmodule AdventOfCode.Solution.Year2023.Day09 do
  def part1(input) do
    input
    |> parse_reverse()
    |> Enum.map(&extrapolate_forward/1)
    |> Enum.sum()
  end

  def part2(input) do
    input
    |> parse()
    |> Enum.map(&extrapolate_backward/1)
    |> Enum.sum()
  end

  defp extrapolate_forward(values, acc \\ 0)

  defp extrapolate_forward(values, acc) do
    if Enum.all?(values, &(&1 == 0)) do
      acc
    else
      values
      |> Enum.chunk_every(2, 1, :discard)
      # (List is reversed, so diff is left-right rather than right-left.)
      |> Enum.map(fn [b, a] -> b - a end)
      |> extrapolate_forward(acc + hd(values))
    end
  end

  defp extrapolate_backward(values)

  defp extrapolate_backward(values) do
    if Enum.all?(values, &(&1 == 0)) do
      0
    else
      sub_result =
        values
        |> Enum.chunk_every(2, 1, :discard)
        |> Enum.map(fn [a, b] -> b - a end)
        |> extrapolate_backward()

      hd(values) - sub_result
    end
  end

  defp parse_reverse(input) do
    for line <- String.split(input, "\n", trim: true) do
      line |> String.split() |> Stream.map(&String.to_integer/1) |> Enum.reverse()
    end
  end

  defp parse(input) do
    for line <- String.split(input, "\n", trim: true) do
      for num <- String.split(line), do: String.to_integer(num)
    end
  end
end
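Checked against two of the example histories from the puzzle description; they should extrapolate to 18 (forward) and 5 (backward):

iex> AdventOfCode.Solution.Year2023.Day09.part1("0 3 6 9 12 15")
18
iex> AdventOfCode.Solution.Year2023.Day09.part2("10 13 16 21 30 45")
5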
-
Comment on Day 8: Haunted Wasteland in ~comp.advent_of_code
jzimbel
Elixir
As many others have pointed out, part 2 wasn't very fun because I initially considered that each "ghost" might not always start on step 0 of its respective loop, thus making it far more difficult to calculate where they match up. It wasn't so bad once I figured out that they all start on step 0 and don't have any "sub-loops" / pass through multiple __Z labels, but those properties weren't made clear by the puzzle description.

New lil' Math module
There have been a handful of puzzles that involve calculating GCD/LCM, and I always forget how, so I figured I'd create a reusable module for those and other slightly-more-than-basic math functions.
defmodule AdventOfCode.Math do
  @moduledoc """
  Common math functions/algorithms.
  """

  @doc """
  Greatest common divisor.
  """
  @spec gcd(Enumerable.t(pos_integer)) :: pos_integer
  def gcd(ns), do: Enum.reduce(ns, &gcd/2)

  defp gcd(d, 0), do: d
  defp gcd(a, b), do: gcd(b, Integer.mod(a, b))

  @doc """
  Least common multiple.
  """
  @spec lcm(Enumerable.t(pos_integer)) :: pos_integer
  def lcm(ns), do: Enum.reduce(ns, &lcm/2)

  defp lcm(a, b), do: div(a * b, gcd(a, b))
end
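(Usage, so future me also remembers how it's called, with made-up numbers:)

iex> AdventOfCode.Math.lcm([4, 6, 10])
60
iex> AdventOfCode.Math.gcd([12, 18, 30])
6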
Parts 1 and 2
I've been stubbornly avoiding regexes for parsing inputs this year, because binary pattern matching is neat and weird. As long as the values you're trying to pull out of the string have a constant character length (variable length allowed if it's at the end of the string), binary pattern matching is sufficient.
defmodule AdventOfCode.Solution.Year2023.Day08 do
  alias AdventOfCode.Math

  @type t :: %__MODULE__{
          labels: %{(index :: non_neg_integer) => String.t()},
          step: non_neg_integer,
          zs: MapSet.t(step :: non_neg_integer)
        }

  @enforce_keys [:labels]
  defstruct @enforce_keys ++ [step: 0, zs: MapSet.new()]

  def new(labels) do
    labels
    |> Enum.with_index(fn l, i -> {i, l} end)
    |> Map.new()
    |> then(&%__MODULE__{labels: &1})
  end

  def part1(input) do
    input
    |> parse()
    |> steps_to_z(["AAA"], &(&1 == "ZZZ"))
  end

  def part2(input) do
    input
    |> parse()
    |> steps_to_z(&match?(<<_, _, ?Z>>, &1))
  end

  defp steps_to_z({dirs, map}, done?) do
    steps_to_z({dirs, map}, for(<<_, _, ?A>> = label <- Map.keys(map), do: label), done?)
  end

  defp steps_to_z({dirs, map}, starting_labels, done?) do
    dirs
    |> Stream.cycle()
    |> Enum.reduce_while(new(starting_labels), fn dir, acc ->
      if map_size(acc.labels) == 0 do
        {:halt, acc.zs}
      else
        {:cont, update_acc(acc, map, dir, done?)}
      end
    end)
    |> Math.lcm()
  end

  defp update_acc(acc, map, dir, done?) do
    found_z = Enum.flat_map(acc.labels, fn {i, l} -> if done?.(l), do: [i], else: [] end)

    labels =
      acc.labels
      |> Map.drop(found_z)
      |> Map.new(fn {i, l} -> {i, map[l][dir]} end)

    zs = if match?([_ | _], found_z), do: MapSet.put(acc.zs, acc.step), else: acc.zs

    %{acc | labels: labels, zs: zs, step: acc.step + 1}
  end

  defp parse(input) do
    [dirs, map] = String.split(input, "\n\n")

    dirs = for <<dir::1-bytes <- dirs>>, do: dir |> String.downcase() |> String.to_existing_atom()

    map =
      for <<label::3-bytes, _::4*8, left::3-bytes, _::2*8, right::3-bytes, _::bytes>> <-
            String.split(map, "\n", trim: true),
          into: %{},
          do: {label, %{l: left, r: right}}

    {dirs, map}
  end
end
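(A stripped-down sketch of the binary pattern matching idea, on one made-up map line in the puzzle's format:)

iex> <<label::3-bytes, _::4*8, left::3-bytes, _::2*8, right::3-bytes, _::bytes>> = "AAA = (BBB, CCC)"
iex> {label, left, right}
{"AAA", "BBB", "CCC"}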
Elixir
Pattern matching was useful for this one. I've also been iterating on a Grid module/data structure over several years of AoC puzzles, so I had an easy time parsing the input and moving around the grid.

My approach for both parts was to find all the "starting points" of the desired patterns—X's for part 1 and A's for part 2. Then, I checked locally around each one for matching patterns. Each X could form up to 8 XMAS's, each A formed either 0 or 1 "X-MAS".

Both parts
Just the solution code—here is my Grid module, if you're interested.
A fun thing: I updated Grid.lines_of_values to be optionally lazy-evaluating for this puzzle. Using the lazy version makes my part 1 solution run 10x faster! (It avoids unnecessary extra cell lookups.)