jzimbel's recent activity
-
Comment on Day 10: Hoof It in ~comp.advent_of_code
-
Comment on Day 11: Plutonian Pebbles in ~comp.advent_of_code
jzimbel I realized keeping the stones as strings was a silly choice. After switching to integers, my solution is a bit more concise and also runs faster.
Both parts, using integers instead of strings
defmodule AdventOfCode.Solution.Year2024.Day11 do
  use AdventOfCode.Solution.SharedParse

  import Integer, only: [is_even: 1]

  @impl true
  def parse(input), do: input |> String.split() |> Enum.map(&String.to_integer/1)

  def part1(stones), do: solve(stones, 25)
  def part2(stones), do: solve(stones, 75)

  defp solve(stones, blinks) do
    stones
    |> collect()
    |> Stream.iterate(&blink/1)
    |> Enum.at(blinks)
    |> Enum.map(fn {_stone, n} -> n end)
    |> Enum.sum()
  end

  defp blink(stones, acc \\ [])

  defp blink([], acc), do: collect(acc)

  defp blink([{stone, n} | rest], acc) do
    acc =
      cond do
        stone == 0 ->
          [{1, n} | acc]

        is_even(digits = trunc(:math.log10(stone)) + 1) ->
          divisor = Integer.pow(10, div(digits, 2))
          [{rem(stone, divisor), n}, {div(stone, divisor), n} | acc]

        true ->
          [{stone * 2024, n} | acc]
      end

    blink(rest, acc)
  end

  defp collect([{_stone, _n} | _] = stones) do
    stones
    |> Enum.group_by(fn {stone, _n} -> stone end, fn {_stone, n} -> n end)
    |> Enum.map(fn {stone, ns} -> {stone, Enum.sum(ns)} end)
  end

  defp collect([_stone | _] = stones) do
    stones
    |> Enum.frequencies()
    |> Map.to_list()
  end
end
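For anyone squinting at the even-digit branch of the cond, a quick check of the arithmetic with a made-up stone value (illustration only, not part of the solution):

stone = 253000
digits = trunc(:math.log10(stone)) + 1      # 6 digits
divisor = Integer.pow(10, div(digits, 2))   # 1_000
{div(stone, divisor), rem(stone, divisor)}  # => {253, 0}, the left and right halves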
Benchmarks
Name       ips        average        deviation    median         99th %
Parse      275.55 K       3.63 μs    ±214.14%         3.50 μs        4.88 μs
Part 1       1.48 K     676.32 μs     ±52.08%       626.25 μs     1302.52 μs
Part 2     0.0263 K   38024.57 μs      ±3.23%     38143.92 μs    41155.89 μs

Comparison:
Parse      275.55 K
Part 1       1.48 K - 186.36x slower +672.70 μs
Part 2     0.0263 K - 10477.83x slower +38020.94 μs
-
Comment on Day 11: Plutonian Pebbles in ~comp.advent_of_code
jzimbel (edited) After I got my efficient part 2 solution working, I tried continuing to 100 iterations and beyond and found something kind of neat! At some point, new unique numbers stop being generated. For my particular puzzle input, there are 3811 unique numbers that can be generated. The example input
125 17
can generate just 54 different numbers. Anyway,
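If you want to reproduce that observation, here's a rough standalone sketch (module name and helpers made up; it re-implements the blink rules with integer digits rather than reusing the solution below). It accumulates every stone value ever seen and reports the set's size after each blink; the size stops growing after a while, and for the example input the plateau should land right around the 54 mentioned above, give or take whether you count the starting stones.

defmodule UniqueStones do
  # Accumulate every stone value ever seen; the set's size eventually stops growing.
  def sizes(stones, blinks) do
    stones
    |> MapSet.new()
    |> Stream.iterate(fn seen ->
      seen
      |> Enum.flat_map(&blink_one/1)
      |> MapSet.new()
      |> MapSet.union(seen)
    end)
    |> Stream.map(&MapSet.size/1)
    |> Enum.take(blinks + 1)
  end

  defp blink_one(0), do: [1]

  defp blink_one(stone) do
    digits = Integer.digits(stone)

    if rem(length(digits), 2) == 0 do
      {a, b} = Enum.split(digits, div(length(digits), 2))
      [Integer.undigits(a), Integer.undigits(b)]
    else
      [stone * 2024]
    end
  end
end

# UniqueStones.sizes([125, 17], 60) produces a list of set sizes that levels off.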
Both parts (Elixir)
defmodule AdventOfCode.Solution.Year2024.Day11 do
  use AdventOfCode.Solution.SharedParse

  import Integer, only: [is_even: 1]

  @impl true
  def parse(input), do: String.split(input)

  def part1(stones), do: solve(stones, 25)
  def part2(stones), do: solve(stones, 75)

  defp solve(stones, blinks) do
    stones
    |> collect()
    |> Stream.iterate(&blink/1)
    |> Enum.at(blinks)
    |> Enum.map(fn {_stone, n} -> n end)
    |> Enum.sum()
  end

  defp blink(stones, acc \\ [])

  defp blink([], acc), do: collect(acc)

  defp blink([stone | stones], acc) do
    acc =
      case stone do
        {"0", n} ->
          [{"1", n} | acc]

        {stone, n} when is_even(byte_size(stone)) ->
          {a, b} = split(stone)
          [{b, n}, {a, n} | acc]

        {stone, n} ->
          stone = stone |> String.to_integer() |> Kernel.*(2024) |> Integer.to_string()
          [{stone, n} | acc]
      end

    blink(stones, acc)
  end

  defp split(str) do
    half = str |> byte_size() |> div(2)
    a = str |> binary_part(0, half) |> trim_zeros()
    b = str |> binary_part(half, half) |> trim_zeros()
    {a, b}
  end

  defp trim_zeros(str) do
    case String.trim_leading(str, "0") do
      "" -> "0"
      trimmed -> trimmed
    end
  end

  defp collect([{_stone, _n} | _] = stones) do
    stones
    |> Enum.group_by(fn {stone, _n} -> stone end, fn {_stone, n} -> n end)
    |> Enum.map(fn {stone, ns} -> {stone, Enum.sum(ns)} end)
  end

  defp collect([_stone | _] = stones) do
    stones
    |> Enum.frequencies()
    |> Map.to_list()
  end
end
Benchmarks
Name       ips         average      deviation    median        99th %
Parse      288130.82   0.00347 ms   ±229.80%     0.00342 ms    0.00471 ms
Part 1        774.97      1.29 ms     ±6.84%        1.28 ms       1.46 ms
Part 2          7.89    126.79 ms     ±1.35%      126.76 ms     134.28 ms

Comparison:
Parse      288130.82
Part 1        774.97 - 371.80x slower +1.29 ms
Part 2          7.89 - 36531.34x slower +126.78 ms
-
Comment on Day 9: Disk Fragmenter in ~comp.advent_of_code
jzimbel Behold, two abominations of pattern matching: the twin beasts, relocate_file/2 and clear_file/2.
Look On My Works And Despair!
Both parts (Elixir)
(If you think it's incomprehensible now, you should have seen how it looked before I defined the hp macro)
defmodule AdventOfCode.Solution.Year2024.Day09 do
  use AdventOfCode.Solution.SharedParse

  @impl true
  def parse(input) do
    input
    |> String.trim()
    |> :binary.bin_to_list()
    # Parse all digits in the string
    |> Enum.map(&(&1 - ?0))
    # Expand each digit to a file or free segment of the corresponding length
    |> Enum.flat_map_reduce({:file, 0}, fn
      n, {:file, i} -> {List.duplicate(i, n), {:free, i + 1}}
      n, {:free, i} -> {List.duplicate(:_, n), {:file, i}}
    end)
    |> then(fn {disk, _acc} -> disk end)
  end

  def part1(disk) do
    disk
    |> shift_files_left_p1()
    |> checksum()
  end

  def part2(disk) do
    disk
    |> shift_files_left_p2()
    |> checksum()
  end

  defp shift_files_left_p1(disk) do
    file_blocks = disk |> Enum.reject(&(&1 == :_)) |> Enum.reverse()

    {new_disk, _} =
      disk
      # Since all file blocks are being compacted left,
      # we know the number of blocks needed ahead of time.
      |> Enum.take(length(file_blocks))
      |> Enum.map_reduce(file_blocks, fn
        # See a free block: emit a file block off the end
        :_, [file_block | file_blocks] -> {file_block, file_blocks}
        # See a file block: emit the file block
        file_block, file_blocks -> {file_block, file_blocks}
      end)

    new_disk
  end

  defp shift_files_left_p2(disk) do
    files = disk |> Enum.reject(&(&1 == :_)) |> Enum.reverse() |> Enum.chunk_by(& &1)
    disk = Enum.chunk_by(disk, & &1)

    files
    |> Enum.reduce(disk, &relocate_file/2)
    |> List.flatten()
  end

  # "Head pattern". Matches a list with pattern `pat` as head and anything as tail.
  # hp(:foo) expands to [:foo | _]
  # hp(my_var) expands to [my_var | _], binding `my_var` to the head of a nonempty list
  defmacrop hp(pat), do: quote(do: [unquote(pat) | _])

  ### Puts file at its new position (if one exists)
  ### and then replaces its old position with free blocks using `clear_file`
  defp relocate_file(file, disk)

  defp relocate_file(file, [hp(:_) = free | rest]) when length(file) <= length(free) do
    # Found a fit
    [file, Enum.drop(free, length(file)) | clear_file(hd(file), rest)]
  end

  defp relocate_file(file, [hp(:_) = free | rest]) do
    # Too small
    [free | relocate_file(file, rest)]
  end

  defp relocate_file(hp(id), hp(hp(id)) = disk) do
    # Reached the original file without finding a fit
    disk
  end

  defp relocate_file(file, [other_file | rest]) do
    # A different file
    [other_file | relocate_file(file, rest)]
  end

  ### Clears moved file after `relocate_file` puts it in new position
  defp clear_file(id, disk)

  defp clear_file(id, [hp(:_) = l_free, hp(id) = file, hp(:_) = r_free | rest]) do
    # Found the file and it has free space both before and after it
    [l_free ++ List.duplicate(:_, length(file)) ++ r_free | rest]
  end

  defp clear_file(id, [hp(:_) = l_free, hp(id) = file | rest]) do
    # Found the file and it has free space before only
    [l_free ++ List.duplicate(:_, length(file)) | rest]
  end

  defp clear_file(id, [hp(id) = file, hp(:_) = r_free | rest]) do
    # Found the file and it has free space after only
    [List.duplicate(:_, length(file)) ++ r_free | rest]
  end

  defp clear_file(id, [hp(id) = file | rest]) do
    # Found the file and it's surrounded by other files
    [List.duplicate(:_, length(file)) | rest]
  end

  defp clear_file(id, [other | rest]) do
    # A free space or a different file
    [other | clear_file(id, rest)]
  end

  defp checksum(disk) do
    disk
    |> Enum.with_index()
    |> Enum.reject(&match?({:_, _i}, &1))
    |> Enum.map(&Tuple.product/1)
    |> Enum.sum()
  end
end
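In case the nested hp patterns are hard to read, here are the same shapes written out by hand (illustration only; the example lists are made up):

[:_ | _] = [:_, :_, :_]                # hp(:_): a chunk of free blocks
[id | _] = [7, 7, 7]                   # hp(id): a file chunk, binding id = 7
[[id | _] | _] = [[7, 7], [:_], [3]]   # hp(hp(id)): a disk whose first chunk is file `id`

# So relocate_file(hp(id), hp(hp(id)) = disk) matches exactly when the next chunk
# of the remaining disk is the file being moved, i.e. no gap was found to its left.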
Benchmarks
Name      ips      average      deviation    median       99th %
Parse     659.55     1.52 ms      ±7.23%       1.50 ms      1.87 ms
Part 1    277.49     3.60 ms      ±6.45%       3.53 ms      4.24 ms
Part 2      1.11   897.97 ms      ±2.04%     898.05 ms    916.18 ms

Comparison:
Parse     659.55
Part 1    277.49 - 2.38x slower +2.09 ms
Part 2      1.11 - 592.26x slower +896.45 ms
-
Comment on Day 7: Bridge Repair in ~comp.advent_of_code
jzimbel (edited) I started out on this thinking I could do a little better than brute force by doing the following:
- sort each list of numbers and start testing with all +'s, so that each attempt would produce a generally increasing result*
- stop testing once the results became greater than the target number, since further tests would produce only more numbers greater than the target.
Buuuuut sorting the list would cause the operations to be applied in the wrong order. That was it for my optimization ideas, so brute force it is!
Both parts (Elixir)
Somewhat out of laziness, I generated the op permutations from an incrementing integer.
E.g. on a line with n numbers, I would need n-1 operators.
For part 1, this would require testing up to 2^(n-1) op combos, so I used an integer starting at 0 and incrementing up to that number. When it was, say, 6, that would translate to binary 1 1 0, and then I'd translate each digit to an operator: * * +
Part 2 was almost the same, except there were up to 3^(n-1) combos and I converted the integer to ternary: 6 -> 0 2 0 -> + || +.
defmodule AdventOfCode.Solution.Year2024.Day07 do
  use AdventOfCode.Solution.SharedParse

  @impl true
  def parse(input) do
    for line <- String.split(input, "\n", trim: true) do
      [target, ns] = String.split(line, ":")
      {String.to_integer(target), for(n <- String.split(ns), do: String.to_integer(n))}
    end
  end

  def part1(equations), do: sum_solvable(equations, 2)
  def part2(equations), do: sum_solvable(equations, 3)

  defp sum_solvable(eqs, n_ops) do
    for {target, _ns} = eq <- eqs, solvable?(eq, n_ops), reduce: 0, do: (acc -> acc + target)
  end

  def solvable?({target, ns}, n_ops) do
    ns
    |> stream_calculations(n_ops)
    |> Enum.any?(&(&1 == target))
  end

  def stream_calculations([], _n_ops), do: []
  def stream_calculations([n], _n_ops), do: [n]

  def stream_calculations(ns, n_ops) do
    upper_bound = n_ops ** (length(ns) - 1)

    Stream.unfold(0, fn
      ^upper_bound -> nil
      op_combo -> {apply_ops(op_combo, ns, n_ops), op_combo + 1}
    end)
  end

  defp apply_ops(op_combo, ns, n_ops) do
    op_combo
    |> Integer.digits(n_ops)
    |> zero_pad(length(ns))
    |> Enum.zip_reduce(ns, 0, &eval_op/3)
  end

  defp zero_pad(l, n), do: List.duplicate(0, max(0, n - length(l))) ++ l

  defp eval_op(0, b, a), do: a + b
  defp eval_op(1, b, a), do: a * b
  defp eval_op(2, b, a), do: a * 10 ** (1 + trunc(:math.log10(b))) + b
end
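To make the integer-to-operators trick concrete, here's a small worked example (operand values made up, mapping 0 -> +, 1 -> *, 2 -> ||):

ns = [2, 3, 4, 5]
op_combo = 6

Integer.digits(op_combo, 2)   # => [1, 1, 0]
# zero-padded to [0, 1, 1, 0] over ns: ((2 * 3) * 4) + 5 = 29

Integer.digits(op_combo, 3)   # => [2, 0]
# zero-padded to [0, 0, 2, 0] over ns: ((2 + 3) || 4) + 5 = 54 + 5 = 59

# The leading pad digit always pairs with the first operand while the accumulator
# is still 0, so it acts as a no-op "+" and the remaining digits are the real operators.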
Benchmarks
Is there any way to optimize this? (Besides the obvious parallelization by line)
It almost seems like a cryptography problem.
Running the whole thing in one process:
Name      ips     average       deviation    median        99th %
Parse     302.82     3.30 ms      ±2.25%        3.28 ms       3.53 ms
Part 1     34.66    28.85 ms      ±5.40%       28.69 ms      37.07 ms
Part 2      0.56  1777.75 ms      ±0.59%     1783.57 ms    1783.96 ms

Comparison:
Parse     302.82
Part 1     34.66 - 8.74x slower +25.55 ms
Part 2      0.56 - 538.34x slower +1774.45 ms
After switching to a Task.async_stream call to check each line concurrently:

Name      ips     average      deviation    median       99th %
Part 1    105.77     9.45 ms      ±2.56%      9.45 ms     10.17 ms
Part 2      3.08   324.21 ms      ±2.38%    324.76 ms    333.83 ms
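The concurrent version can be a fairly small change; something along these lines (a sketch only, not necessarily the exact code that produced the numbers above):

defp sum_solvable(eqs, n_ops) do
  eqs
  |> Task.async_stream(
    fn {target, _ns} = eq -> if solvable?(eq, n_ops), do: target, else: 0 end,
    ordered: false
  )
  |> Enum.reduce(0, fn {:ok, n}, acc -> acc + n end)
end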
* (tbh I think this logic was flawed as well. For example, 5 + 1 = 6 and 5 * 1 = 5.)
-
Comment on Day 8: Resonant Collinearity in ~comp.advent_of_code
jzimbel Straightforward. This is the 4th day in a row that I've used streams to represent infinite sequences, it works well and can turn a lot of problems normally solved with loops into essentially operating on a list, or lists. It lets you separate the logic that generates values from the logic that acts on those values.
Both parts (Elixir)
I went a little over the top abstracting my code so both parts can reuse a single solve/2 function...

defmodule AdventOfCode.Solution.Year2024.Day08 do
  use AdventOfCode.Solution.SharedParse

  alias AdventOfCode.Grid, as: G

  @impl true
  def parse(input), do: G.from_input(input)

  def part1(grid), do: solve(grid, &Enum.slice(&1, 1, 1))
  def part2(grid), do: solve(grid)

  defp solve(grid, stream_limiter \\ &Function.identity/1) do
    grid
    |> antenna_groups()
    |> Enum.map(&antinode_streams/1)
    |> List.flatten()
    |> Enum.map(stream_limiter)
    |> Enum.flat_map(fn s -> Enum.take_while(s, &G.in_bounds?(grid, &1)) end)
    |> MapSet.new()
    |> MapSet.size()
  end

  defp antenna_groups(grid) do
    grid
    |> G.filter_cells(fn {_coords, char} -> char != ?. end)
    |> Enum.group_by(
      fn {_coords, char} -> char end,
      fn {coords, _char} -> coords end
    )
    |> Map.values()
  end

  # Produces pairs of streams that step outward from 2 antennae, forever
  defp antinode_streams(coords) do
    for {c1, i} <- Enum.with_index(coords), c2 <- Enum.drop(coords, i + 1) do
      diff_forwards = subtract(c2, c1)
      diff_backwards = invert(diff_forwards)

      [
        Stream.iterate(c2, &add(&1, diff_forwards)),
        Stream.iterate(c1, &add(&1, diff_backwards))
      ]
    end
  end

  defp add({x1, y1}, {x2, y2}), do: {x1 + x2, y1 + y2}
  defp subtract({x1, y1}, {x2, y2}), do: {x1 - x2, y1 - y2}
  defp invert({x, y}), do: {-x, -y}
end
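A quick note on the stream_limiter trick, using one made-up antenna pair: each stream walks away from the pair forever, and part 1's limiter just slices out the single antinode one step past each antenna.

{x1, y1} = {4, 3}
{x2, y2} = {5, 5}
{dx, dy} = {x2 - x1, y2 - y1}    # {1, 2}

forwards = Stream.iterate({x2, y2}, fn {x, y} -> {x + dx, y + dy} end)

Enum.take(forwards, 4)       # => [{5, 5}, {6, 7}, {7, 9}, {8, 11}]
Enum.slice(forwards, 1, 1)   # => [{6, 7}], the single part-1 antinode on this side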
-
Comment on Day 6: Guard Gallivant in ~comp.advent_of_code
jzimbel Part 2 was fun. I had a basic plan going into it, but came up with a few more little optimizations along the way.
Some of these optimizations were large time complexity -> space complexity tradeoffs, though, as I was holding onto a lot of history along each step of each re-run of the guard's walk.
The end result runs in 1.8 sec, which seems pretty solid based on what others have reported.
One of the main tricky things was knowing when I needed to be checking just the guard's position vs her position and heading. I had to fix a few bugs related to this. I probably didn't do myself any favors by using nested tuples to structure the data. Maps or even structs would have made it easier to make sense of things.
Since functional programming is all about obstinately refusing to use loops like a normal person (They're procedural!! 😱🤢🧐) I modeled the guard states using Stream functions, which can lazily produce and manipulate infinite series.
Both parts
I used my existing Grid module to parse the input. I might create a companion Heading module soon as well, because I keep treating headings as a special case of points/coordinates, when they're kind of a whole different thing.

defmodule AdventOfCode.Solution.Year2024.Day06 do
  use AdventOfCode.Solution.SharedParse

  alias AdventOfCode.Grid, as: G

  @impl true
  def parse(input) do
    grid = G.from_input(input)

    [{guard_start, ?^}] = G.filter_cells(grid, &match?({_, ?^}, &1))

    # Replace ^ with .
    grid = G.replace(grid, guard_start, ?.)

    {grid, guard_start}
  end

  def part1({grid, guard_start}) do
    grid
    |> stream_steps(guard_start)
    |> MapSet.new(fn {coords, _heading} -> coords end)
    |> MapSet.size()
  end

  def part2({grid, guard_start}) do
    path =
      grid
      |> stream_steps(guard_start)
      |> Enum.to_list()

    # Try placing walls along the guard's original path, one at a time
    path
    |> Stream.with_index(fn {coords, _heading}, i -> {coords, i} end)
    # If the path crosses over itself, we don't need to try placing a wall there again
    |> Stream.uniq_by(fn {coords, _i} -> coords end)
    # Can't place a wall on top of the guard
    |> Stream.drop(1)
    |> Enum.count(fn {coords, i} -> wall_causes_loop?(grid, coords, i, path) end)
  end

  defp wall_causes_loop?(grid, wall_coords, wall_step_idx, path) do
    # Place the wall
    grid = G.replace(grid, wall_coords, ?#)

    # Fast-forward to the step just before the new wall
    step_before_wall = Enum.at(path, wall_step_idx - 1)
    {coords, heading} = step_before_wall
    visited_before_wall = MapSet.new(Enum.take(path, max(0, wall_step_idx - 2)))

    # See if the guard repeats a step from there
    grid
    |> stream_steps(coords, heading)
    |> Enum.reduce_while(visited_before_wall, fn step, visited ->
      if step in visited do
        {:halt, :loop}
      else
        {:cont, MapSet.put(visited, step)}
      end
    end)
    |> Kernel.==(:loop)
  end

  @spec stream_steps(G.t(), G.coordinates()) :: Enumerable.t({G.coordinates(), G.heading()})
  @spec stream_steps(G.t(), G.coordinates(), G.heading()) ::
          Enumerable.t({G.coordinates(), G.heading()})
  defp stream_steps(grid, coords, heading \\ {0, -1}) do
    Stream.unfold({coords, heading}, fn
      nil ->
        nil

      {coords, heading} ->
        next_coords = G.sum_coordinates(coords, heading)

        case G.at(grid, next_coords) do
          nil -> {{coords, heading}, nil}
          ?. -> {{coords, heading}, {next_coords, heading}}
          ?# -> {{coords, heading}, {coords, turn_right(heading)}}
        end
    end)
  end

  defp turn_right({x, y}), do: {-y, x}
end
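As a side note on the one-line heading rotation: with y growing downward, {x, y} -> {-y, x} is a 90° clockwise turn. A quick check (illustration only):

turn_right = fn {x, y} -> {-y, x} end

turn_right.({0, -1})   # up    => {1, 0}   (right)
turn_right.({1, 0})    # right => {0, 1}   (down)
turn_right.({0, 1})    # down  => {-1, 0}  (left)
turn_right.({-1, 0})   # left  => {0, -1}  (up)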
Benchmarks
Name      ips      average     deviation    median      99th %
Part 1   1238.78   0.00081 s     ±3.68%     0.00082 s   0.00086 s
Parse     428.80   0.00233 s     ±5.56%     0.00230 s   0.00275 s
Part 2      0.55      1.80 s     ±1.86%        1.79 s      1.84 s

Comparison:
Part 1   1238.78
Parse     428.80 - 2.89x slower +0.00152 s
Part 2      0.55 - 2234.52x slower +1.80 s
-
Comment on Day 5: Print Queue in ~comp.advent_of_code
jzimbel Elixir
Pretty straightforward, but it did take some up-front data structure planning. I ended up parsing the rules into a map of %{page => set(pages_that_come_after_it)}. Originally I was storing two sets under each key—pages that came before the key and pages that came after. But it turned out I only needed one set.
One tricky case was the rightmost page, e.g. 13 in the example input. I had a bug at first that didn't create a map entry for that page, since it never appeared on the LHS. I fixed this by creating a page => empty_set entry for each RHS page, only if an entry didn't already exist, as I built the map.
My solution runs pretty fast. I believe it's close to the optimal pure-Elixir solution, without doing things like single-pass parse + solve that sacrifice readability.
Name     ips       average      deviation    median      99th %
Part 1   8.67 K    115.40 μs      ±6.12%     113.08 μs   133.42 μs
Part 2   3.97 K    251.99 μs     ±12.02%     244.75 μs   333.42 μs
Parse    2.06 K    486.24 μs      ±6.03%     476.63 μs   585.01 μs

Comparison:
Part 1   8.67 K
Part 2   3.97 K - 2.18x slower +136.59 μs
Parse    2.06 K - 4.21x slower +370.83 μs
Both parts
I decided to use a lot of comprehensions for this one, for no particular reason.
defmodule AdventOfCode.Solution.Year2024.Day05 do
  use AdventOfCode.Solution.SharedParse

  @typep page :: integer

  @impl true
  @spec parse(String.t()) :: {
          # k comes before all pages in v
          %{page => MapSet.t(page)},
          [[page]]
        }
  def parse(input) do
    [rules, updates] = String.split(input, "\n\n", trim: true)
    {parse_afters(rules), parse_updates(updates)}
  end

  defp parse_afters(rules) do
    for rule <- String.split(rules, "\n", trim: true), reduce: %{} do
      afters ->
        [before, after_] = for page <- String.split(rule, "|"), do: String.to_integer(page)

        afters
        |> Map.update(before, MapSet.new([after_]), &MapSet.put(&1, after_))
        # Adds an entry for the last page (which has no afters)
        |> Map.put_new(after_, MapSet.new())
    end
  end

  defp parse_updates(updates) do
    for update <- String.split(updates, "\n", trim: true) do
      for page <- String.split(update, ",") do
        String.to_integer(page)
      end
    end
  end

  def part1({afters, updates}) do
    for(update <- updates, ordered?(update, afters), do: update)
    |> sum_middles()
  end

  def part2({afters, updates}) do
    for(update <- updates, sorted = sorted_or_nil(update, afters), sorted != nil, do: sorted)
    |> sum_middles()
  end

  defp ordered?(update, afters) do
    update
    |> Enum.chunk_every(2, 1, :discard)
    |> Enum.all?(&page_pair_ordered?(&1, afters))
  end

  defp sorted_or_nil(update, afters) do
    case Enum.sort(update, &page_pair_ordered?([&1, &2], afters)) do
      ^update -> nil
      sorted -> sorted
    end
  end

  defp page_pair_ordered?([l, r], afters), do: r in afters[l]

  defp sum_middles(updates) do
    for update <- updates, reduce: 0 do
      acc -> acc + Enum.at(update, div(length(update), 2))
    end
  end
end
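For anyone trying to picture the data structure, here's the map parse_afters would build from a tiny made-up rule set (not the real example input):

# Rules: 47|53, 47|13, 53|13
%{
  47 => MapSet.new([53, 13]),
  53 => MapSet.new([13]),
  # 13 never appears on the left-hand side, so Map.put_new gives it an empty set
  13 => MapSet.new()
}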
-
Comment on Day 4: Ceres Search in ~comp.advent_of_code
jzimbel (edited) Elixir
Pattern matching was useful for this one. I've also been iterating on a Grid module/data structure over several years of AoC puzzles, so I had an easy time parsing the input and moving around the grid.
My approach for both parts was to find all the "starting points" of the desired patterns—X's for part 1 and A's for part 2. Then, I checked locally around each one for matching patterns. Each X could form up to 8 XMAS's, each A formed either 0 or 1 "X-MAS".
Both parts
Just the solution code—here is my Grid module, if you're interested.
A fun thing: I updated Grid.lines_of_values to be optionally lazy-evaluating for this puzzle. Using the lazy version makes my part 1 solution run 10x faster! (It avoids unnecessary extra cell lookups.)

defmodule AdventOfCode.Solution.Year2024.Day04 do
  alias AdventOfCode.Grid, as: G

  use AdventOfCode.Solution.SharedParse

  @impl true
  def parse(input), do: G.from_input(input)

  @spec part1(G.t()) :: integer()
  def part1(grid) do
    grid
    |> G.filter_cells(&match?({_, ?X}, &1))
    |> Enum.map(&count_xmas(&1, grid))
    |> Enum.sum()
  end

  def part2(grid) do
    grid
    |> G.filter_cells(&match?({_, ?A}, &1))
    |> Enum.count(&x_mas?(&1, grid))
  end

  defp count_xmas({coords, ?X}, grid) do
    grid
    |> G.lines_of_values(coords, :all, true)
    |> Enum.count(&match?(~c"MAS", Enum.take(&1, 3)))
  end

  defp x_mas?({coords, ?A}, grid) do
    case G.adjacent_values(grid, coords, :intercardinal) do
      [a, a, b, b] when a != b and a in ~c"MS" and b in ~c"MS" -> true
      [a, b, a, b] when a != b and a in ~c"MS" and b in ~c"MS" -> true
      _not_x_mas_or_A_is_on_edge_of_grid -> false
    end
  end
end
-
Comment on Day 3: Mull It Over in ~comp.advent_of_code
jzimbel (edited) Elixir
I decided to solve this one with regular expressions even though I'll probably regret it if future puzzles keep iterating on this concept.
I learned about the (?|...) capture group for this one. Shout out to my pal regex101.com.
Elixir's regex implementation has this nice x modifier that makes it ignore all whitespace within the pattern, as well as anything after "#" on a line. This lets you break the pattern up into separate lines and add inline comments. Great for readability!
Both parts
defmodule AdventOfCode.Solution.Year2024.Day03 do
  use AdventOfCode.Solution.SharedParse

  @impl true
  def parse(input), do: String.trim(input)

  def part1(mem) do
    ~r/mul\((\d{1,3}),(\d{1,3})\)/
    |> Regex.scan(mem, capture: :all_but_first)
    |> sum_of_products()
  end

  def part2(mem) do
    # (?| ...) drops unmatched alternatives from the returned captures.
    # E.g. matching "don't()" will produce ["don't()"] instead of ["", "", "", "don't()"]
    ~r"""
    (?|
      (mul)\((\d{1,3}),(\d{1,3})\)
      |(do)\(\)
      |(don't)\(\)
    )
    """x
    |> Regex.scan(mem, capture: :all_but_first)
    |> apply_toggles()
    |> sum_of_products()
  end

  defp sum_of_products(factor_pairs) do
    factor_pairs
    |> Enum.map(fn [a, b] -> String.to_integer(a) * String.to_integer(b) end)
    |> Enum.sum()
  end

  defp apply_toggles(matches) do
    Enum.reduce(matches, %{on?: true, acc: []}, fn
      ["mul" | factor_pair], state when state.on? ->
        update_in(state.acc, &[factor_pair | &1])

      ["mul" | _factor_pair], state when not state.on? ->
        state

      ["do"], state ->
        %{state | on?: true}

      ["don't"], state ->
        %{state | on?: false}
    end).acc
  end
end
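If the branch-reset group is new to you too, here's a tiny standalone illustration (example string made up):

# Both alternatives write to capture group 1, so each match comes back as a
# single-element list instead of being padded with empty strings:
Regex.scan(~r/(?|(do)\(\)|(don't)\(\))/, "xx do() yy don't() zz", capture: :all_but_first)
# => [["do"], ["don't"]]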
-
Comment on Day 2: Red-Nosed Reports in ~comp.advent_of_code
jzimbel Elixir
I thought for a while about what a "smart" approach for part 2 would look like, but then I tried out the brute-force approach since it was a one-liner, and it ran in a few hundred microseconds so... seems good!
Both parts
In safe_p1?, the Stream.scan call lazily checks each pair of numbers, emitting :not_safe if any pair is invalid. (The other values it emits aren't important as long as they are something other than :not_safe.)
The Enum.all? call stops and returns false as soon as it finds a :not_safe in the stream.
This combo is (IMO) a bit cleaner and easier to understand than a single Enum.reduce_while that achieves the same "stateful short-circuiting check" logic.
I use a boolean accumulator, asc?, to track whether the numbers are ascending (true) or descending (false). It starts out as nil and gets set after the first interval is checked.

defmodule AdventOfCode.Solution.Year2024.Day02 do
  use AdventOfCode.Solution.SharedParse

  def parse(input) do
    for line <- String.split(input, "\n", trim: true) do
      line
      |> String.split()
      |> Enum.map(&String.to_integer/1)
    end
  end

  def part1(reports), do: Enum.count(reports, &safe_p1?/1)
  def part2(reports), do: Enum.count(reports, &safe_p2?/1)

  defp safe_p1?(ns) do
    ns
    |> Enum.chunk_every(2, 1, :discard)
    |> Stream.scan(nil, &check_interval/2)
    |> Enum.all?(&(&1 != :not_safe))
  end

  defp check_interval([a, b], asc?) do
    with true <- gradual?(a, b),
         {true, asc?} <- asc_desc?(a, b, asc?) do
      asc?
    else
      false -> :not_safe
    end
  end

  defp gradual?(a, b), do: abs(b - a) in 1..3

  defp asc_desc?(a, b, nil), do: {true, b - a > 0}
  defp asc_desc?(a, b, asc?) when b - a > 0 == asc?, do: {true, asc?}
  defp asc_desc?(_a, _b, _asc?), do: false

  # Brute-force runs in a few hundred µs so I guess it's fine!
  defp safe_p2?(ns) do
    safe_p1?(ns) or
      Enum.any?(
        0..(length(ns) - 1),
        &safe_p1?(List.delete_at(ns, &1))
      )
  end
end
-
Comment on Day 1: Historian Hysteria in ~comp.advent_of_code
jzimbel Using Elixir again this year.
After using it for 4 consecutive years, I've built up a nice project with various conveniences and a few modules for the data structures commonly used by AoC puzzles.
I'm thinking about learning to use dataframes with Explorer (Elixir's answer to Python's pandas), but for now I just used the standard library.
Parts 1 and 2
(The SharedParse thing signals to my solution runner that both part1 and part2 can reuse the same value parsed from the input.)

defmodule AdventOfCode.Solution.Year2024.Day01 do
  use AdventOfCode.Solution.SharedParse

  @impl AdventOfCode.Solution.SharedParse
  def parse(input) do
    input
    |> String.split()
    |> Enum.map(&String.to_integer/1)
    |> then(
      &[
        _col_a = Enum.take_every(&1, 2),
        _col_b = Enum.take_every(tl(&1), 2)
      ]
    )
  end

  def part1(cols) do
    cols
    |> Enum.map(&Enum.sort/1)
    |> Enum.zip()
    |> Enum.map(fn {a, b} -> abs(a - b) end)
    |> Enum.sum()
  end

  def part2([col_a, col_b]) do
    a_freq = Enum.frequencies(col_a)
    b_freq = Enum.frequencies(col_b)

    a_freq
    |> Enum.map(fn {n, freq} -> n * freq * Map.get(b_freq, n, 0) end)
    |> Enum.sum()
  end
end
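In case the take_every trick in parse is hard to read at a glance, here's what it does to a toy input:

# "3 4\n4 3\n" split and converted to integers:
ns = [3, 4, 4, 3]

Enum.take_every(ns, 2)       # => [3, 4]   (left column)
Enum.take_every(tl(ns), 2)   # => [4, 3]   (right column)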
-
Comment on Annual gift of a Christmas tree from the people of Oslo in Norway to London continues – tree was cut down early yesterday morning in woods in the northern part of Oslo's capital in ~life
-
Comment on Megalopolis | Official trailer in ~movies
jzimbel YouTube says the video is private now. Just “This video is private” on a black background.
What happened?
Edit - It’s still available on other channels. Mirror
-
Comment on Where are you on the spectrum of vacation planning? Detailed to the hour or floating like a leaf in the wind? in ~travel
jzimbel Funny you mention Alcatraz, because my partner and I took this approach for a visit to SF and had an absolutely wonderful time—we booked a table at a nice restaurant and made plans with relatives in the area, and otherwise just chose a different city park to explore each day. It was super cheap and super fun to get around with the municipal electric bikeshare and public transit.
-
Comment on Tildes Video Thread in ~misc
-
Comment on Day 22: Sand Slabs in ~comp.advent_of_code
jzimbel Ah gotcha, I had missed that the first new brick (the lower of the two) caused one of the seven bricks from the original example to become load-bearing when it previously wasn't. So the new count of 5 was composed of 4 from the existing set of 7, and 1 from the new pair.
And yes, that fixed it!
-
Comment on Day 22: Sand Slabs in ~comp.advent_of_code
jzimbel Sorry if this is resurfacing bad debugging memories, but how does that even work? Both of the two new bricks are load-bearing? If 1,2,11~1,4,11 starts above all the others, how can it be load-bearing?
(I think I'm missing the same edge case as you were, thanks for sharing the extra example!)
-
Comment on Is there a programming language that brings you joy? in ~comp
jzimbel It’s elixir for me—while the syntax can look a bit odd at first, you eventually realize it’s composed of a small number of basic elements with several layers of optional syntactic sugar on top. At its core, syntactically, it’s similar to a lisp. The language’s creator has actually described it as a lisp-2 language once or twice. This allows for robust and relatively user-friendly macros.
Separately from the syntax, the BEAM runtime’s focus on actor model/shared-nothing concurrency means you can (and are encouraged to) create applications that behave like a network of fault-tolerant mini-services rather than a single monolithic program. The way you structure your logic is just very different from other languages, and in a good way. You also don’t need to reach for additional technologies as often—for example, it’s trivial to set up a cache without memcached or the like, or cronjob-like tasks.
Having used elixir as the backend language at my job for the past 3 years, I can truthfully say that tracking down and fixing bugs in our elixir code has never taken more than a couple hours. The immutable functional style within each elixir process, and the supervisor tree / actor model between processes, generally makes for a very resilient and observable system.
-
Comment on ‘Winning requires hard work’: Wayfair CEO sends employees a gloomy pre-holiday email following layoff-filled year in ~finance
jzimbel Also worked at Wayfair, 2017-2020, as a SWE, and I also experienced a strong culture of “What are you still doing here at 5(:30), it’s not that serious, go home right now”. Upper management definitely leaned a bit sociopath, but in general most of the teams I worked on had decent managers who cared about their people’s wellbeing.
Niraj sent out “story time” emails like this weekly, I believe, most with a theme of “work hard, work smart”.
Not to defend the company—it was and is kind of a mess—but we all learned to basically tune out these emails from Niraj and co.
Try asking around about the “WayThrifty” one. lol.
I'd been avoiding this one because it seemed like it would be a cumbersome pathfinding problem, but it turned out to be... very easy?
The only difference between part 1 and part 2 was whether to keep duplicates in my list of in-progress paths. Annoyingly, the for special form (aka comprehensions) has a weird limitation where its :uniq option only accepts boolean literals, so I had to conditionally filter to unique values outside of the comprehension.
Both parts (Elixir)
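The workaround looks roughly like this (names and values made up; the actual solution isn't shown here):

keep_duplicates? = false
values = [1, 1, 2, 3, 3]

# :uniq must be a literal, so dedupe after the comprehension when a flag says to.
expanded = for v <- values, do: v * 10
expanded = if keep_duplicates?, do: expanded, else: Enum.uniq(expanded)
# => [10, 20, 30]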
Benchmarks