jzimbel's recent activity

  1. Comment on What's your unpopular food opinion or idiosyncrasy? in ~food

    jzimbel

    MSG is fine. It should be right up there with salt as a commonly used seasoning. It makes anything savory taste better, and it’s naturally occurring in most of the foods that we already use to add an umami kick to a dish. (Cheese, mushrooms, seaweed, tomatoes, peas, kimchi, and more)

    17 votes
  2. Comment on Why store code as text files? in ~comp

    jzimbel
    (edited)

    Regarding formatting, most projects with multiple contributors these days just use an automatic formatter and enforce its usage with a check in their continuous integration pipelines. Of the languages I have used, I can think of 2 with formatters that ship directly with the language (gofmt, mix format) and 2 others that have third-party formatters that the community has generally settled on using (prettier, black).
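
    For example, a minimal version of that setup in an Elixir project looks roughly like this: the formatter reads a .formatter.exs file at the project root, and CI just runs mix format --check-formatted, which exits non-zero if anything isn't formatted.

    # .formatter.exs (the default file generated by `mix new`)
    [
      inputs: ["{mix,.formatter}.exs", "{config,lib,test}/**/*.{ex,exs}"]
    ]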

    In my opinionated opinion, standardizing on a formatter is by far the simpler approach, and saves tons of time in code review. There have been very few cases where the formatter made my or my reviewees’ code less readable, and I’ll gladly accept that in exchange for its complete removal of person-by-person, line-by-line subjectivity from that aspect of software development.

    Tl;dr: Save everyone’s time and let the authors of an official auto-formatter decide what’s best. Standardization is extremely valuable.

    2 votes
  3. Comment on Judge says BNSF unions can’t strike over new attendance rule in ~news

    jzimbel

    Judge Mark Pittman said a strike would hurt BNSF and could cause significant damage to the economy because of the role the railroad plays in delivering all kinds of goods.

    I mean… isn’t the entire point of a strike to demonstrate the value of your labor by letting your employer and the general public see the negative consequences when you collectively withhold it?

    13 votes
  4. Comment on Solving the operator ‘shortage’ by not running transit like a business in ~finance

    jzimbel

    This post is by someone who previously worked at the transit authority I work at, who used her deep understanding of power structures gained from a community organizing background to make positive changes, and whom I really respect.

    I’m sharing her writing in this more general space because it touches on a handful of interesting changes to how companies/organizations are structured and source their talent in both the private and public sectors, and the impacts of those changes.

    If there is a better place for this than ~misc, I apologize. It didn’t seem to fit neatly in any of the other groups.

    As in the example at Apple, outsourcing meant that, along with any lower wages or benefits, workers in the outsourced positions likely have a limited career pathway with the private employer. Previously, an employee could have worked as a customer service agent during a long career at the MBTA, but today, being a T Ambassador [a contracted customer assistance worker stationed at transit stops] is less likely to be a stepping stone in a career than a short stint in a string of low-wage jobs.

    I suspect that the desire to have ‘professional’ managers is, at least in part, to create an experience barrier between executives and workers under the theory that someone who worked their way up would be less inclined to cut costs in ways that impact the people following behind them. Regardless of the motivation, the impact is a limit on the upward career mobility of the workforce and decrease in the incentives to invest in their professional development. In transit agencies like the MBTA this increases the demographic divide between the frontline workforce, which is predominantly people of color, and management, which is significantly white.

    4 votes
  5. Comment on Day 24: Arithmetic Logic Unit in ~comp.advent_of_code

    jzimbel
    (edited)

    Anyone have a hint for part 1? Seems like building the ALU is the easy part, and the real challenge is figuring out what the validator program does since a brute-force approach (trying every 14-digit number in descending order) doesn't seem to return in any reasonable amount of time.

    I tried looking at the validator's output for 20 consecutive inputs, but it doesn't seem to give any clear indication of how to hone in on the maximum (or any) valid input in a non-brute-force way.

    Edit: I looked at some reddit threads on this day's puzzle. I have no idea what anyone is talking about. Prooooobably gonna skip this one 😅

    2 votes
  6. Comment on Day 25: Sea Cucumber in ~comp.advent_of_code

    jzimbel

    Elixir

    Pretty quick and simple. Posting my messy code as-is and going to bed. Merry Christmas, folks.

    Part 1
    defmodule AdventOfCode.Solution.Year2021.Day25 do
      def part1(input) do
        {grid, width, height} = parse_input(input)
    
        grid
        |> Stream.iterate(&step(&1, width, height))
        |> Stream.chunk_every(2, 1, :discard)
        |> Stream.with_index(1)
        |> Enum.find(&match?({[grid, grid], _}, &1))
        |> elem(1)
      end
    
      def part2(_input) do
      end
    
      defp step(grid, width, height) do
        grid
        |> Enum.into(%{}, fn
          {{x, y}, ?>} = cuke ->
            move_to = {rem(x + 1, width), y}
    
            case grid[move_to] do
              nil -> {move_to, ?>}
              _ -> cuke
            end
    
          down ->
            down
        end)
        |> then(fn grid ->
          Enum.into(grid, %{}, fn
            {{x, y}, ?v} = cuke ->
              move_to = {x, rem(y + 1, height)}
    
              case grid[move_to] do
                nil -> {move_to, ?v}
                _ -> cuke
              end
    
            right ->
              right
          end)
        end)
      end
    
      defp parse_input(input) do
        charlists =
          input
          |> String.split()
          |> Enum.map(&String.to_charlist/1)
    
        height = length(charlists)
        width = length(hd(charlists))
    
        grid =
          for {line, y} <- Enum.with_index(charlists),
              {char, x} <- Enum.with_index(line),
              char != ?.,
              into: %{},
              do: {{x, y}, char}
    
        {grid, width, height}
      end
    end
    
    1 vote
  7. Comment on Day 22: Reactor Reboot in ~comp.advent_of_code

    jzimbel
    (edited)

    I got part 2! It runs in about 2.6s. Switching to the optimized function for part 1 also cuts down the runtime from 1.4s to 78ms. I absolutely melted my brain trying to get the logic right—visualizing this stuff is a challenge. The code you see is the result of starting with some longer pseudo-code that considers all the possible ways two cuboids can intersect and boiling it down to one function that works on every possible intersection.

    Part 2 (and part 1 using the optimized cuboid-intersecting function)
    defmodule AdventOfCode.Solution.Year2021.Day22 do
      def part1(input) do
        input
        |> parse_input()
        |> Enum.map(fn {action, cuboid} -> {action, clamp(cuboid)} end)
        |> Enum.reduce([], &switch_region/2)
        |> Enum.map(&volume/1)
        |> Enum.sum()
      end
    
      def part2(input) do
        input
        |> parse_input()
        |> Enum.reduce([], &switch_region/2)
        |> Enum.map(&volume/1)
        |> Enum.sum()
      end
    
      defp switch_region({:on, on_cuboid}, cuboids) do
        subtract_from_all([on_cuboid], cuboids) ++ cuboids
      end
    
      defp switch_region({:off, off_cuboid}, cuboids) do
        subtract_from_all(cuboids, [off_cuboid])
      end
    
      # Computes the difference of `lhs_cuboids` - `rhs_cuboids`,
      # splitting `lhs_cuboids` into smaller parts as necessary to preserve
      # it as a list of non-overlapping cuboids.
      defp subtract_from_all(lhs_cuboids, rhs_cuboids) do
        Enum.reduce(rhs_cuboids, lhs_cuboids, fn rhs_cuboid, lhs_acc ->
          Enum.flat_map(lhs_acc, &subtract(&1, rhs_cuboid))
        end)
      end
    
      # Subtracts `rhs_cuboid` from `lhs_cuboid`, splitting `lhs_cuboid` into
      # smaller parts as necessary. Returns result as a list of non-overlapping
      # cuboids.
      defp subtract(lhs_cuboid, rhs_cuboid) do
        if disjoint?(lhs_cuboid, rhs_cuboid) do
          [lhs_cuboid]
        else
          {xmin1..xmax1, ymin1..ymax1 = y, zmin1..zmax1 = z} = lhs_cuboid
          {xmin2..xmax2, ymin2..ymax2, zmin2..zmax2} = rhs_cuboid
    
          clamped_x = max(xmin1, xmin2)..min(xmax1, xmax2)//1
          clamped_y = max(ymin1, ymin2)..min(ymax1, ymax2)//1
    
          [
            {xmin1..(xmin2 - 1)//1, y, z},
            {(xmax2 + 1)..xmax1//1, y, z},
            {clamped_x, ymin1..(ymin2 - 1)//1, z},
            {clamped_x, (ymax2 + 1)..ymax1//1, z},
            {clamped_x, clamped_y, zmin1..(zmin2 - 1)//1},
            {clamped_x, clamped_y, (zmax2 + 1)..zmax1//1}
          ]
          |> Enum.reject(&(volume(&1) == 0))
        end
      end
    
      defp clamp({x_range, y_range, z_range}) do
        {clamp_range(x_range), clamp_range(y_range), clamp_range(z_range)}
      end
    
      defp clamp_range(lower..upper//1) do
        max(lower, -50)..min(upper, 50)//1
      end
    
      defp volume({x_range, y_range, z_range}) do
        Range.size(x_range) * Range.size(y_range) * Range.size(z_range)
      end
    
      defp disjoint?({x_range1, y_range1, z_range1}, {x_range2, y_range2, z_range2}) do
        Range.disjoint?(x_range1, x_range2) or
          Range.disjoint?(y_range1, y_range2) or
          Range.disjoint?(z_range1, z_range2)
      end
    
      defp parse_input(input) do
        input
        |> String.split("\n", trim: true)
        |> Enum.map(&parse_line/1)
      end
    
      defp parse_line(line) do
        line
        |> String.split()
        |> then(fn [on_off, ranges] ->
          {String.to_existing_atom(on_off), parse_ranges(ranges)}
        end)
      end
    
      defp parse_ranges(ranges) do
        ~r/(-?\d+)..(-?\d+)/
        |> Regex.scan(ranges, capture: :all_but_first)
        |> Enum.map(fn [lower, upper] ->
          String.to_integer(lower)..String.to_integer(upper)
        end)
        |> List.to_tuple()
      end
    end
    
    2 votes
  8. Comment on Day 22: Reactor Reboot in ~comp.advent_of_code

    jzimbel
    (edited)

    Elixir
    Part 1 was easy-peasy, but those huge numbers in the puzzle input definitely struck fear into my heart.

    I did some thinking on paper for part 2, and I think I know what I need to do for an optimized solution, but... yikes, it seems like actually doing it will be really challenging. Here's what I've got for part 1, at least. Also a small hint at how I'm planning to implement part 2 from the function stubs/argument names.

    My naive solution for part 1 runs in about 1.4s.

    Part 1
    defmodule AdventOfCode.Solution.Year2021.Day22 do
      def part1(input) do
        input
        |> parse_input()
        |> Enum.map(fn {action, {x_range, y_range, z_range}} ->
          {action, {clamp_range(x_range), clamp_range(y_range), clamp_range(z_range)}}
        end)
        |> Enum.reduce(MapSet.new(), &switch_region_naive/2)
        |> MapSet.size()
      end
    
      def part2(input) do
        input
        |> parse_input()
        |> Enum.reduce([], &switch_region_optimized/2)
        |> Enum.map(fn {xmin..xmax, ymin..ymax, zmin..zmax} ->
          (xmax - xmin) * (ymax - ymin) * (zmax - zmin)
        end)
        |> Enum.sum()
      end
    
      for {action, set_fn} <- [on: &MapSet.union/2, off: &MapSet.difference/2] do
        defp switch_region_naive({unquote(action), {x_range, y_range, z_range}}, on_now) do
          unquote(set_fn).(
            on_now,
            for(
              x <- x_range,
              y <- y_range,
              z <- z_range,
              into: MapSet.new(),
              do: {x, y, z}
            )
          )
        end
      end
    
      defp switch_region_optimized({_action, _ranges}, _non_overlapping_ranges) do
        :do_the_thing!
      end
    
      defp clamp_range(lower..upper//1) do
        max(lower, -50)..min(upper, 50)//1
      end
    
      defp parse_input(input) do
        input
        |> String.split("\n", trim: true)
        |> Enum.map(&parse_line/1)
      end
    
      defp parse_line(line) do
        line
        |> String.split()
        |> then(fn [on_off, ranges] ->
          {String.to_existing_atom(on_off), parse_ranges(ranges)}
        end)
      end
    
      defp parse_ranges(ranges) do
        ~r/(-?\d+)..(-?\d+)/
        |> Regex.scan(ranges, capture: :all_but_first)
        |> Enum.map(fn [lower, upper] ->
          String.to_integer(lower)..String.to_integer(upper)
        end)
        |> List.to_tuple()
      end
    end
    
    Part 2 thoughts

    Whenever these optimization problems come up, the solution is usually to find a way to model the problem with numbers alone. I think this one is no different.

    It looks like instead of using a set or similar structure to track every single 3d point that is currently "on", we need to keep the representation closer to how it's represented in the puzzle input—a list of groups of 3 ranges, each representing a cuboid.

    I'm trying to figure out how to decompose the result of each action into a list of non-overlapping cuboids. Keeping them from overlapping is the important part—it makes calculating the total volume at the end as simple as summing the volumes of each cuboid. I found this SO question on the topic, and it looks pretty involved in two dimensions, let alone 3...

    Edit: This problem is slightly more manageable than the SO post I linked because if you set it up right, you're only ever checking the overlap of two cuboids at a time and every operation starts and ends with non-overlapping cuboids. There are still a lot of different cases to consider in checking each pair of cuboids, though.
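
    To sketch the idea in one dimension (a toy helper, purely illustrative and not the actual solution): subtracting one range from another leaves at most two non-overlapping pieces, and the 3D analogue leaves at most six cuboids per subtraction.

    defmodule RangeSubtract do
      # Subtract range `b` from range `a`, returning the non-overlapping leftovers.
      def subtract_1d(amin..amax = a, bmin..bmax = b) do
        if Range.disjoint?(a, b) do
          [a]
        else
          [amin..min(amax, bmin - 1)//1, max(amin, bmax + 1)..amax//1]
          |> Enum.reject(&(Range.size(&1) == 0))
        end
      end
    end

    # RangeSubtract.subtract_1d(1..10, 4..6) => [1..3, 7..10]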

    1 vote
  9. Comment on Day 18: Snailfish in ~comp.advent_of_code

    jzimbel

    Elixir

    I thought this one was fun! I had a few bugs to iron out before I got it working, but the abundance of examples to test against really helped make them easy to find.

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day18 do
      def part1(input) do
        input
        |> parse_input()
        |> Enum.reduce(&add(&2, &1))
        |> magnitude()
      end
    
      def part2(input) do
        input
        |> parse_input()
        |> then(fn numbers ->
          indexed_numbers = Enum.with_index(numbers)
    
          for {sn1, i} <- indexed_numbers,
              {sn2, j} <- indexed_numbers,
              i != j,
              do: {sn1, sn2}
        end)
        |> Enum.map(fn {sn1, sn2} -> add(sn1, sn2) end)
        |> Enum.map(&magnitude/1)
        |> Enum.max()
      end
    
      def add(sn1, sn2) do
        reduce({sn1, sn2})
      end
    
      def reduce(sn) do
        with :ok <- try_explode(sn),
             :ok <- try_split(sn) do
          sn
        else
          {:explode, sn, _, _} -> reduce(sn)
          {:split, sn} -> reduce(sn)
        end
      end
    
      def try_explode(sn, level \\ 0)
    
      def try_explode({l, r}, 4) do
        {:explode, 0, {l, false}, {r, false}}
      end
    
      def try_explode({l, r}, level) do
        with {:l, :ok} <- {:l, try_explode(l, level + 1)},
             {:r, :ok} <- {:r, try_explode(r, level + 1)} do
          :ok
        else
          {:l, {:explode, l, l_explode, r_explode}} ->
            {r, r_explode} = put_explode(r, r_explode, :l)
            {:explode, {l, r}, l_explode, r_explode}
    
          {:r, {:explode, r, l_explode, r_explode}} ->
            {l, l_explode} = put_explode(l, l_explode, :r)
            {:explode, {l, r}, l_explode, r_explode}
        end
      end
    
      def try_explode(_n, _level), do: :ok
    
      def try_split({l, r}) do
        with {:l, :ok} <- {:l, try_split(l)},
             {:r, :ok} <- {:r, try_split(r)} do
          :ok
        else
          {:l, {:split, l}} -> {:split, {l, r}}
          {:r, {:split, r}} -> {:split, {l, r}}
        end
      end
    
      def try_split(n) when n >= 10 do
        {:split, {div(n, 2), ceil(n / 2)}}
      end
    
      def try_split(_n), do: :ok
    
      def put_explode(sn, {explode_n, true}, _), do: {sn, {explode_n, true}}
    
      def put_explode(sn, {explode_n, false}, side) do
        put_explode(sn, explode_n, side)
      end
    
      def put_explode({l, r}, explode_n, :l) do
        {l, record} = put_explode(l, explode_n, :l)
        {{l, r}, record}
      end
    
      def put_explode({l, r}, explode_n, :r) do
        {r, record} = put_explode(r, explode_n, :r)
        {{l, r}, record}
      end
    
      def put_explode(n, explode_n, _), do: {n + explode_n, {explode_n, true}}
    
      def magnitude({l, r}), do: 3 * magnitude(l) + 2 * magnitude(r)
      def magnitude(n), do: n
    
      def parse_input(input) do
        if safe?(input) do
          input
          |> String.replace(~w|[ ]|, fn
            "[" -> "{"
            "]" -> "}"
          end)
          |> String.split("\n", trim: true)
          |> Enum.map(&Code.eval_string/1)
          |> Enum.map(&elem(&1, 0))
        else
          raise "I ain't eval-ing that"
        end
      end
    
      def safe?(input), do: Regex.match?(~r/^[\[\]\d,\n]+$/, input)
    end
    
    1 vote
  10. Comment on Day 17: Trick Shot in ~comp.advent_of_code

    jzimbel
    (edited)

    Elixir

    Catching up after a little break! This one wasn't too hard. It looked like part 1 could be solved with basic math since we only really care about the y position, but I went ahead and set up code to simulate shots assuming we would need to do more than the bare minimum for part 2. That worked out well for me.

    Side note: the Elixir syntax highlighter used here doesn't distinguish used from unused match variables—any variable starting with a _ is unused and is only included so that the pattern on the left matches the expression on the right. These variables are normally grayed out in most editors.

    Both parts

    simulate_shot/2 returns a stream that emits the position at each step until (and excluding) the first position past the target region.

    The rest of the problem boils down to simulating a bunch of shots and querying the results for the first one that matches a condition (goes from a position at or above y=0 to a position inside the y-bounds of the target in one step), or counting up all of the results that match a condition (hits the target).

    defmodule AdventOfCode.Solution.Year2021.Day17 do
      def part1(input) do
        {_x_target, ymin..ymax} = target = parse_input(input)
    
        -ymin..-ymax//-1
        |> Stream.map(&simulate_shot({0, &1}, target))
        |> Enum.find(&first_step_hit?(&1, target))
        |> Stream.map(fn {_x, y} -> y end)
        |> Enum.max()
      end
    
      def part2(input) do
        {_xmin..xmax, ymin.._ymax} = target = parse_input(input)
    
        for(vx <- 0..xmax, vy <- ymin..-ymin, do: {vx, vy})
        |> Stream.map(&simulate_shot(&1, target))
        |> Enum.count(&hits_target?(&1, target))
      end
    
      defp simulate_shot(v, target) do
        {v, {0, 0}}
        |> Stream.iterate(&step/1)
        |> Stream.map(&elem(&1, 1))
        |> Stream.take_while(&in_simulation_bounds?(&1, target))
      end
    
      defp step({{vx, vy}, {x, y}}) do
        {{vx - unit(vx), vy - 1}, {x + vx, y + vy}}
      end
    
      defp in_simulation_bounds?({x, y}, {xmin..xmax, ymin.._ymax}) do
        x in min(0, xmin)..max(0, xmax) and y >= ymin
      end
    
      # Detects shots where the probe hits the target's y bounds on its first step below y=0
      defp first_step_hit?(shot, {_x_target, y_target}) do
        shot
        |> Stream.chunk_every(2, 1, :discard)
        |> Enum.any?(fn [{_x1, y1}, {_x2, y2}] -> y1 >= 0 and y2 in y_target end)
      end
    
      defp hits_target?(shot, target) do
        Enum.any?(shot, &on_target?(&1, target))
      end
    
      defp on_target?({x, y}, {x_target, y_target}) do
        x in x_target and y in y_target
      end
    
      defp parse_input(input) do
        ~r/(-?\d+)\.\.(-?\d+)/
        |> Regex.scan(input, capture: :all_but_first)
        |> List.flatten()
        |> Enum.map(&String.to_integer/1)
        |> then(fn [xmin, xmax, ymin, ymax] ->
          {xmin..xmax, ymin..ymax}
        end)
      end
    
      defp unit(0), do: 0
      defp unit(n) when n > 0, do: 1
      defp unit(_n), do: -1
    end
    
    1 vote
  11. Comment on Day 16: Packet Decoder in ~comp.advent_of_code

    jzimbel
    (edited)

    Elixir
    Fun! This problem is something that Elixir (or really Erlang, which Elixir is built on top of/interops with) has batteries-included support for. It has a bitstring type and a whole special subset of pattern matching syntax that makes for fast, concise, declarative code for parsing bitstrings. This strong support exists because Erlang was purpose-built for low latency communication gateways and distributed systems.

    Runtime: My solution runs in about 260 - 270μs for each part.
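
    If you haven't seen it, the bitstring syntax looks roughly like this (a toy match, not part of the solution): you can peel fixed-width integer fields off the front of a bitstring directly in a pattern.

    # Pull a 3-bit version and a 3-bit type off the front, keeping the rest as bits
    <<version::3, type::3, rest::bits>> = <<0b110100::6, 0b1010::4>>
    # version == 6, type == 4, rest == <<0b1010::4>>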

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day16 do
      def part1(input) do
        input
        |> String.trim()
        |> hex_to_bitstring()
        |> parse()
        |> version_sum()
      end
    
      def part2(input) do
        input
        |> String.trim()
        |> hex_to_bitstring()
        |> parse()
        |> eval()
      end
    
      defp parse(packet) do
        elem(do_parse(packet), 0)
      end
    
      defp do_parse(packet, packets_remaining \\ 1, parsed \\ [])
    
      # base case for parsing a specific length of bits
      defp do_parse(<<>>, _, parsed) do
        {Enum.reverse(parsed), <<>>}
      end
    
      # base case for parsing a specific number of packets
      defp do_parse(rest, 0, parsed) do
        {Enum.reverse(parsed), rest}
      end
    
      # literal
      defp do_parse(<<version::3, 4::3, literal::bits>>, packets_remaining, parsed) do
        {n, rest} = parse_literal(literal)
        do_parse(rest, packets_remaining - 1, [{:literal, n, version} | parsed])
      end
    
      # operator - total bit length
      defp do_parse(
             <<version::3, type::3, 0::1, len::15, packets::bits-size(len), rest::bits>>,
             packets_remaining,
             parsed
           ) do
        {parsed_sub_packets, <<>>} = do_parse(packets, -1)
        do_parse(rest, packets_remaining - 1, [{get_op(type), parsed_sub_packets, version} | parsed])
      end
    
      # operator - packet count
      defp do_parse(
             <<version::3, type::3, 1::1, sub_packet_count::11, sub_packets::bits>>,
             packets_remaining,
             parsed
           ) do
        {parsed_sub_packets, rest} = do_parse(sub_packets, sub_packet_count)
        do_parse(rest, packets_remaining - 1, [{get_op(type), parsed_sub_packets, version} | parsed])
      end
    
      defp parse_literal(literal, acc \\ [])
    
      defp parse_literal(<<0::1, last_part::4, rest::bits>>, acc) do
        literal_bits =
          [<<last_part::4>> | acc]
          |> Enum.reverse()
          |> :erlang.list_to_bitstring()
    
        n_size = bit_size(literal_bits)
    
        <<n::size(n_size)>> = literal_bits
    
        {n, rest}
      end
    
      defp parse_literal(<<1::1, n_part::4, rest::bits>>, acc) do
        parse_literal(rest, [<<n_part::4>> | acc])
      end
    
      for {op, id} <- [sum: 0, product: 1, min: 2, max: 3, gt: 5, lt: 6, eq: 7] do
        defp get_op(unquote(id)), do: unquote(op)
      end
    
      defp version_sum(parsed) do
        Enum.reduce(parsed, 0, fn
          {:literal, _n, version}, acc -> acc + version
          {_op, nested, version}, acc -> acc + version + version_sum(nested)
        end)
      end
    
      defp eval([expr]), do: eval(expr)
    
      defp eval({:literal, n, _}), do: n
    
      for {op, math_fun} <- [
            sum: &Enum.sum/1,
            product: &Enum.product/1,
            min: &Enum.min/1,
            max: &Enum.max/1
          ] do
        defp eval({unquote(op), args, _version}), do: unquote(math_fun).(Enum.map(args, &eval/1))
      end
    
      for {op, comp_fun} <- [gt: &Kernel.>/2, lt: &Kernel.</2, eq: &Kernel.==/2] do
        defp eval({unquote(op), [arg1, arg2], _version}),
          do: if(unquote(comp_fun).(eval(arg1), eval(arg2)), do: 1, else: 0)
      end
    
      defp hex_to_bitstring(input, acc \\ [])
    
      defp hex_to_bitstring(<<>>, acc) do
        acc
        |> Enum.reverse()
        |> :erlang.list_to_bitstring()
      end
    
      defp hex_to_bitstring(<<hex_digit::bytes-size(1), rest::bytes>>, acc) do
        hex_to_bitstring(rest, [<<String.to_integer(hex_digit, 16)::4>> | acc])
      end
    end
    
    2 votes
  12. Comment on Day 15: Chiton in ~comp.advent_of_code

    jzimbel

    Elixir

    It terminates, it passes the provided test cases and gives the correct answer for part 1, but it apparently gives an incorrect (too low) answer for my real part 2 input. I'm stumped.

    I was not interested in writing an efficient Dijkstra's algorithm implementation since I've already done that at least once in college, so I grabbed a libgraph library that does it for me. It's possible that the bug lies somewhere within that library, idk.

    Really not a fan of this puzzle.

    Part 1 and (buggy, I guess?) part 2
    defmodule AdventOfCode.Solution.Year2021.Day15 do
      alias AdventOfCode.CharGrid
      alias Graph
    
      def part1(input) do
        input
        |> CharGrid.from_input()
        |> make_numeric()
        |> shortest_path_weight()
      end
    
      def part2(input) do
        input
        |> CharGrid.from_input()
        |> make_numeric()
        |> embiggen()
        |> shortest_path_weight()
      end
    
      defp shortest_path_weight(grid) do
        grid
        |> build_graph()
        |> Graph.dijkstra({0, 0}, {grid.width - 1, grid.height - 1})
        |> Enum.drop(1)
        |> Enum.map(&CharGrid.at(grid, &1))
        |> Enum.sum()
      end
    
      defp build_graph(grid) do
        Graph.new()
        |> Graph.add_edges(
          Stream.flat_map(grid.grid, fn {coords, _} ->
            grid
            |> CharGrid.adjacent_cells(coords, :cardinal)
            |> Enum.map(fn {neighbor_coords, neighbor_value} ->
              {coords, neighbor_coords, weight: neighbor_value}
            end)
          end)
        )
      end
    
      defp embiggen(grid) do
        grid
        |> CharGrid.to_list()
        |> Stream.flat_map(fn {{x, y}, value} ->
          for x_offset <- 0..4,
              y_offset <- 0..4 do
            {
              {x + grid.width * x_offset, y + grid.height * y_offset},
              rem(value + x_offset + y_offset - 1, 9) + 1
            }
          end
        end)
        |> then(&%CharGrid{
          grid: Enum.into(&1, %{}),
          width: grid.width * 5,
          height: grid.height * 5
        })
      end
    
      defp make_numeric(grid) do
        CharGrid.map(grid, fn {_, char_value} -> char_value - ?0 end)
      end
    end
    
    1 vote
  13. Comment on Day 14: Extended Polymerization in ~comp.advent_of_code

    jzimbel

    Elixir

    Does it work for both parts? Yes.
    Does it run quickly? Yes—3.5ms for part 2.
    Will I explain my nightmare code? ...No.

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day14 do
      def part1(input) do
        measure_frequencies_at(input, 10)
      end
    
      def part2(input) do
        measure_frequencies_at(input, 40)
      end
    
      defp measure_frequencies_at(input, step) do
        {frequencies, rules, first_char, last_char} = parse_input(input)
    
        frequencies
        |> Stream.iterate(&insert(&1, rules))
        |> Enum.at(step)
        |> bigram_frequencies_to_unigram_frequencies(first_char, last_char)
        |> Map.values()
        |> Enum.min_max()
        |> then(fn {least_common_count, most_common_count} ->
          most_common_count - least_common_count
        end)
      end
    
      defp insert(freqs, rules) do
        freqs
        |> Enum.flat_map(fn {[char_a, char_b] = chars, count} ->
          rules[chars]
          |> then(fn insertion ->
            [
              {[char_a, insertion], count},
              {[insertion, char_b], count}
            ]
          end)
        end)
        |> Enum.group_by(&elem(&1, 0), &elem(&1, 1))
        |> Enum.map(fn {k, counts} -> {k, Enum.sum(counts)} end)
      end
    
      defp bigram_frequencies_to_unigram_frequencies(freqs, first_char, last_char) do
        freqs
        |> Enum.flat_map(fn {[char_a, char_b], freq} ->
          [
            {char_a, freq},
            {char_b, freq}
          ]
        end)
        |> Enum.group_by(&elem(&1, 0), &elem(&1, 1))
        |> Enum.map(fn {k, counts} -> {k, Enum.sum(counts)} end)
        |> Enum.into(%{}, fn
          {char, doubled_freq} -> {char, ceil(doubled_freq / 2)}
        end)
        |> then(fn freqs ->
          if first_char == last_char do
            Map.update!(freqs, first_char, &(&1 + 1))
          else
            freqs
          end
        end)
      end
    
      defp parse_input(input) do
        [first_char | rest_chars] = String.to_charlist(input)
        last_char = List.last(rest_chars)
    
        input
        |> String.split("\n\n", trim: true)
        |> then(fn [polymer, rules] ->
          {bigram_frequencies(polymer), parse_rules(rules), first_char, last_char}
        end)
      end
    
      defp bigram_frequencies(string) do
        string
        |> String.to_charlist()
        |> Enum.chunk_every(2, 1, :discard)
        |> Enum.frequencies()
      end
    
      defp parse_rules(rules) do
        rules
        |> String.split("\n", trim: true)
        |> Enum.map(fn <<char_a, char_b, " -> ", insertion>> -> {[char_a, char_b], insertion} end)
        |> Enum.into(%{})
      end
    end
    
    2 votes
  14. Comment on Day 13: Transparent Origami in ~comp.advent_of_code

    jzimbel

    Elixir

    This one was pretty straightforward, and luckily I had defined a CharSpace module while doing last year's puzzles, which made it pretty simple to print out the final result. I'm not going to include that code because it's really over-engineered for the relatively basic needs of this specific puzzle (I needed to generalize it for up to 4 dimensions for last year's puzzle), so just know that it's a thing that can take a list of coordinate pairs and produce a string that "draws" them on an appropriately-sized canvas.
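
    The gist of it, as a rough sketch rather than the real module (the names here are made up), is plotting a set of {x, y} points onto a text canvas sized to fit them:

    defmodule DotRender do
      def render(coords) do
        {xmax, _} = Enum.max_by(coords, &elem(&1, 0))
        {_, ymax} = Enum.max_by(coords, &elem(&1, 1))
        points = MapSet.new(coords)

        Enum.map_join(0..ymax, "\n", fn y ->
          Enum.map_join(0..xmax, fn x ->
            if {x, y} in points, do: "#", else: "."
          end)
        end)
      end
    end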

    I've found it really helpful/intuitive to use streams whenever a puzzle involves iteration over the initial input. At the top level, this approach transforms the problem into simply reading values from an enumerable. For this problem, the appropriate stream function to do the fold iterations was Stream.unfold/2, which gave me a chuckle.
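
    For anyone who hasn't used it, Stream.unfold/2 works roughly like this (toy example): the function returns {value_to_emit, next_accumulator}, and the stream ends when it returns nil.

    Stream.unfold(1, fn
      n when n > 16 -> nil
      n -> {n, n * 2}
    end)
    |> Enum.to_list()
    # => [1, 2, 4, 8, 16]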

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day13 do
      alias AdventOfCode.CharSpace.TwoDim
    
      def part1(input) do
        input
        |> parse_input()
        |> stream_folds()
        |> Enum.at(1)
        |> MapSet.size()
      end
    
      def part2(input) do
        input
        |> parse_input()
        |> stream_folds()
        |> Enum.at(-1)
        |> TwoDim.from_coords_list()
        |> to_string()
      end
    
      defp stream_folds(initial_data) do
        Stream.unfold(initial_data, fn
          nil ->
            nil
    
          {coords_set, []} ->
            {coords_set, nil}
    
          {coords_set, [instruction | instructions]} ->
            {coords_set, {fold(coords_set, instruction), instructions}}
        end)
      end
    
      defp fold(coords_set, {:left, fold_x}) do
        {to_fold, to_remain} =
          coords_set
          |> Enum.reject(fn {x, _} -> x == fold_x end)
          |> Enum.split_with(fn {x, _} -> x > fold_x end)
    
        to_fold
        |> MapSet.new(fn {x, y} -> {2 * fold_x - x, y} end)
        |> MapSet.union(MapSet.new(to_remain))
      end
    
      defp fold(coords_set, {:up, fold_y}) do
        {to_fold, to_remain} =
          coords_set
          |> Enum.reject(fn {_, y} -> y == fold_y end)
          |> Enum.split_with(fn {_, y} -> y > fold_y end)
    
        to_fold
        |> MapSet.new(fn {x, y} -> {x, 2 * fold_y - y} end)
        |> MapSet.union(MapSet.new(to_remain))
      end
    
      defp parse_input(input) do
        [coords, instructions] = String.split(input, "\n\n", trim: true)
    
        {parse_coords_set(coords), parse_instructions(instructions)}
      end
    
      defp parse_coords_set(coords) do
        coords
        |> String.split("\n", trim: true)
        |> MapSet.new(fn line ->
          ~r/(\d+),(\d+)/
          |> Regex.run(line, capture: :all_but_first)
          |> Enum.map(&String.to_integer/1)
          |> List.to_tuple()
        end)
      end
    
      defp parse_instructions(instructions) do
        instructions
        |> String.split("\n", trim: true)
        |> Enum.map(fn line ->
          ~r/(x|y)=(\d+)/
          |> Regex.run(line, capture: :all_but_first)
          |> then(fn
            ["x", n] -> {:left, String.to_integer(n)}
            ["y", n] -> {:up, String.to_integer(n)}
          end)
        end)
      end
    end
    
    4 votes
  15. Comment on Day 12: Passage Pathing in ~comp.advent_of_code

    jzimbel

    Elixir

    I really struggled with debugging my part 2 solution, and in the end it came down to me not updating my visited counts at the right time. I wasn't adding the current cave to the visited counts before consulting them to determine which of the next caves were visitable. That meant that if the current cave was a small one being visited for the second time, and was the first small cave to be visited twice, a cave right after it could get an (invalid) second visit as well. Subtle and maddening. 😵‍💫

    Besides that... I'm happy with my approach, which reuses the same recursive function to count paths, with parts 1 and 2 just passing it a different filter function to determine which of a cave's connections are visitable based on the current visited counts.

    There's some ugly and repetitive binary pattern matching (e.g. <<first_char, _rest::binary>> paired with the guard when first_char in ?a..?z) to determine whether a cave is small, but when I tried replacing it with a small_cave?/1 function call, the run time tripled from ~.5 sec to 1.5! I think I could do more up-front work to tag the different types of caves when parsing the input, but I'm calling it quits for now.

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day12 do
      def part1(input) do
        input
        |> parse_cave_map()
        |> count_paths(&visitable_p1?/2)
      end
    
      def part2(input) do
        input
        |> parse_cave_map()
        |> count_paths(&visitable_p2?/2)
      end
    
      defp count_paths(cave \\ "start", visit_counts \\ %{}, cave_map, visitable_fn)
    
      defp count_paths("end", _, _, _), do: 1
    
      defp count_paths(cave, visit_counts, cave_map, visitable_fn) do
        visit_counts = Map.update(visit_counts, cave, 1, &(&1 + 1))
    
        cave_map[cave]
        |> Enum.filter(&visitable_fn.(&1, visit_counts))
        |> Enum.map(&count_paths(&1, visit_counts, cave_map, visitable_fn))
        |> Enum.sum()
      end
    
      # Part 1 visitable checker
      defp visitable_p1?(<<first_char, _rest::binary>> = small_cave, visit_counts)
           when first_char in ?a..?z do
        Map.get(visit_counts, small_cave, 0) == 0
      end
    
      defp visitable_p1?(_cave, _visit_counts), do: true
    
      # Part 2 visitable checker
      defp visitable_p2?(terminus, visit_counts) when terminus in ~w[start end] do
        Map.get(visit_counts, terminus, 0) == 0
      end
    
      defp visitable_p2?(<<first_char, _rest::binary>> = small_cave, visit_counts)
           when first_char in ?a..?z do
        case Map.get(visit_counts, small_cave, 0) do
          0 -> true
          1 -> no_small_caves_visited_twice?(visit_counts)
          _ -> false
        end
      end
    
      defp visitable_p2?(_cave, _visit_counts), do: true
    
      defp no_small_caves_visited_twice?(visit_counts) do
        visit_counts
        |> Enum.filter(fn {cave, _} -> small_cave?(cave) end)
        |> Enum.all?(fn {_, visit_count} -> visit_count < 2 end)
      end
    
      defp small_cave?(<<first_char, _rest::binary>>), do: first_char in ?a..?z
    
      defp parse_cave_map(input) do
        input
        |> String.split("\n", trim: true)
        |> Enum.map(&Regex.run(~r/(\w+)-(\w+)/, &1, capture: :all_but_first))
        |> Enum.reduce(%{}, fn [cave1, cave2], cave_map ->
          cave_map
          |> Map.update(cave1, [cave2], fn reachable -> [cave2 | reachable] end)
          |> Map.update(cave2, [cave1], fn reachable -> [cave1 | reachable] end)
        end)
      end
    end
    
    1 vote
  16. Comment on Day 11: Dumbo Octopus in ~comp.advent_of_code

    jzimbel
    (edited)

    Elixir, with concurrency

    I think I made this much more difficult for myself, but I learned a lot. I decided to go all in on concurrency and an actor model approach as a learning exercise—I set up agents for each octopus and had them passing messages to one another to simulate the step-by-step level increment and flashes. I also set up a "singleton" counter agent for them to report their flashes to.

    The hardest part to code was synchronizing the parts of each step, since during flash propagation each agent kind of just does its own thing without knowing whether it will receive another "flash" message from a neighbor later on. I ended up accomplishing that with a function (block_until_step_is_done/1), called from the top-level solution process, that repeatedly asks each octopus agent whether it has any messages left in its process mailbox. Probably not the best approach? But it was the best I could come up with.

    I also had to explicitly separate the sub-step of the initial level increment from the cascading increments resulting from neighboring flashes. During each step, a message is broadcast to all octopus agents telling them to increment their level, and only once that message has been sent, they get another message telling them to begin flashing.

    Runtime: On a 4-core machine, part 1 runs in about 185ms and part 2 in 750ms; with 6 cores, 85ms and 320ms. I'd be really curious how this compares to others' solutions!

    FlashCounter agent
    defmodule AdventOfCode.Solution.Year2021.Day11.FlashCounter do
      use Agent
    
      def start_link do
        Agent.start_link(fn -> 0 end, name: __MODULE__)
      end
    
      def stop do
        Agent.stop(__MODULE__)
      end
    
      def increment do
        Agent.cast(__MODULE__, &(&1 + 1))
      end
    
      def value do
        Agent.get(__MODULE__, &Function.identity/1)
      end
    end
    
    Octopus agent
    defmodule AdventOfCode.Solution.Year2021.Day11.Octopus do
      use Agent
    
      alias AdventOfCode.Solution.Year2021.Day11.FlashCounter
    
      defstruct level: 0,
                neighbors: []
    
      def start_link(coords, level, neighbors) do
        Agent.start_link(
          fn -> %__MODULE__{level: level, neighbors: neighbors} end,
          name: agent_name(coords)
        )
      end
    
      def stop(coords) do
        Agent.stop(agent_name(coords))
      end
    
      # Use Agent.update/2 for tick increments so that the call blocks until
      # agent process has received the message and updated its state
      def increment(coords, :tick) do
        Agent.update(agent_name(coords), &do_increment(&1, :tick))
      end
    
      # Use Agent.cast/2 ("fire-and-forget") for increments caused by
      # neighboring flashes since message order isn't important during this part
      def increment(coords, :flash) do
        Agent.cast(agent_name(coords), &do_increment(&1, :flash))
      end
    
      def maybe_flash(coords) do
        Agent.cast(agent_name(coords), &do_maybe_flash/1)
      end
    
      def done?(coords) do
        Agent.get(agent_name(coords), fn _state -> do_done?() end)
      end
    
      defp do_increment(state, source)
    
      # A tick always increments
      defp do_increment(state, :tick), do: %{state | level: state.level + 1}
    
      # A neighbor's flash has no effect if this octopus has already flashed during this step
      defp do_increment(%{level: 0} = state, :flash), do: state
    
      # A neighbor's flash increments and may cause this octopus to flash
      defp do_increment(state, :flash) do
        do_maybe_flash(%{state | level: state.level + 1})
      end
    
      defp do_maybe_flash(%{level: level} = state) when level > 9 do
        flash(state.neighbors)
        %{state | level: 0}
      end
    
      defp do_maybe_flash(state), do: state
    
      defp flash(neighbors) do
        FlashCounter.increment()
        Enum.each(neighbors, &increment(&1, :flash))
      end
    
      defp do_done? do
        match?({:messages, []}, Process.info(self(), :messages))
      end
    
      defp agent_name(coords), do: {:global, {__MODULE__, coords}}
    end
    
    Solution, both parts
    defmodule AdventOfCode.Solution.Year2021.Day11 do
      alias AdventOfCode.CharGrid
      alias __MODULE__.FlashCounter
      alias __MODULE__.Octopus
    
      def part1(input) do
        octopuses = setup(input)
        result = flash_count_after_step(octopuses, 100)
        teardown(octopuses)
    
        result
      end
    
      def part2(input) do
        octopuses = setup(input)
        result = get_first_synchronized_step(octopuses, length(octopuses))
        teardown(octopuses)
    
        result
      end
    
      defp setup(input) do
        FlashCounter.start_link()
    
        grid = CharGrid.from_input(input)
    
        grid
        |> CharGrid.to_list()
        |> tap(fn cells ->
          Enum.each(cells, fn {coords, level_char} ->
            Octopus.start_link(coords, level_char - ?0, CharGrid.adjacent_coords(grid, coords))
          end)
        end)
        |> Enum.map(fn {coords, _} -> coords end)
      end
    
      defp teardown(octopuses) do
        FlashCounter.stop()
        Enum.each(octopuses, &Octopus.stop/1)
      end
    
      defp flash_count_after_step(octopuses, step, current_step \\ 0)
    
      defp flash_count_after_step(_, step, step), do: FlashCounter.value()
    
      defp flash_count_after_step(octopuses, step, current_step) do
        run_step(octopuses)
    
        flash_count_after_step(octopuses, step, current_step + 1)
      end
    
      defp get_first_synchronized_step(octopuses, expected_flashes, current_step \\ 1) do
        initial_flash_count = FlashCounter.value()
        run_step(octopuses)
        flash_count = FlashCounter.value()
    
        if flash_count - initial_flash_count == expected_flashes do
          current_step
        else
          get_first_synchronized_step(octopuses, expected_flashes, current_step + 1)
        end
      end
    
      defp run_step(octopuses) do
        Enum.each(octopuses, &Octopus.increment(&1, :tick))
        Enum.each(octopuses, &Octopus.maybe_flash/1)
        block_until_step_is_done(octopuses)
      end
    
      defp block_until_step_is_done(octopuses) do
        unless Enum.all?(octopuses, &Octopus.done?/1) do
          block_until_step_is_done(octopuses)
        end
      end
    end
    
    3 votes
  17. Comment on Day 10: Syntax Scoring in ~comp.advent_of_code

    jzimbel
    (edited)

    Elixir

    Quick one! My solution is kinda messy, but I'm proud of it. Pattern matching and quasi-macros to generate all the necessary clauses of the validation functions made this quick to implement.

    I might come back and add some explanation of this code since Elixir syntax is a little... out there compared to the C-like languages, but for now I'm going to bed.

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day10 do
      def part1(input) do
        input
        |> parse_input()
        |> Enum.map(&corrupted_char/1)
        |> Enum.reject(&is_nil/1)
        |> Enum.map(&corrupted_score/1)
        |> Enum.sum()
      end
    
      def part2(input) do
        input
        |> parse_input()
        |> Enum.map(&incomplete_stack/1)
        |> Enum.reject(&match?('', &1))
        |> Enum.map(&autocomplete_score/1)
        |> Enum.sort()
        |> then(fn scores -> Enum.at(scores, div(length(scores), 2)) end)
      end
    
      defp corrupted_char(line, bracket_stack \\ '')
    
      defp corrupted_char('', _), do: nil
    
      for {open, close} <- [{?(, ?)}, {?[, ?]}, {?{, ?}}, {?<, ?>}] do
        defp corrupted_char([unquote(open) | rest_line], stack) do
          corrupted_char(rest_line, [unquote(close) | stack])
        end
    
        defp corrupted_char([unquote(close) | rest_line], [unquote(close) | rest_stack]) do
          corrupted_char(rest_line, rest_stack)
        end
      end
    
      defp corrupted_char([char | _], _), do: char
    
      defp incomplete_stack(line, bracket_stack \\ '')
    
      defp incomplete_stack('', stack), do: stack
    
      for {open, close} <- [{?(, ?)}, {?[, ?]}, {?{, ?}}, {?<, ?>}] do
        defp incomplete_stack([unquote(open) | rest_line], stack) do
          incomplete_stack(rest_line, [unquote(close) | stack])
        end
    
        defp incomplete_stack([unquote(close) | rest_line], [unquote(close) | rest_stack]) do
          incomplete_stack(rest_line, rest_stack)
        end
      end
    
      defp incomplete_stack([_ | _], _), do: ''
    
      defp corrupted_score(?)), do: 3
      defp corrupted_score(?]), do: 57
      defp corrupted_score(?}), do: 1197
      defp corrupted_score(?>), do: 25137
    
      defp autocomplete_score(stack) do
        Enum.reduce(stack, 0, fn char, acc -> acc * 5 + autocomplete_point_value(char) end)
      end
    
      defp autocomplete_point_value(?)), do: 1
      defp autocomplete_point_value(?]), do: 2
      defp autocomplete_point_value(?}), do: 3
      defp autocomplete_point_value(?>), do: 4
    
      defp parse_input(input) do
        input
        |> String.split("\n", trim: true)
        |> Enum.map(&String.to_charlist/1)
      end
    end
    
    Hint

    Use a stack!

    Edit: each part runs in 130-150 μs, so that's fun :D

    Edit: Explaining some of the syntax for those interested:

    Pattern matching

    Pattern matching is a core concept in Elixir, and allows for pretty concise and expressive code. It's sort of like JavaScript's destructuring on steroids—you can bind nested values in a data structure to variables while also asserting the shape of the structure. Pattern matching is most often done with the case expression, but it's also possible to use directly in function heads and in many other parts of the language.

    At its core, pattern matching lets you write assertive code based on how the data looks, which can be more expressive than in other languages where the main tool at your disposal is if + an expression that evaluates to a boolean.

    data = %{a: %{b: [1, 2, 3]}, c: "foobar"}
    
    case data do
      %{a: %{b: [_ | _], other_key: _}, c: a_string} -> "Won't match because nested map doesn't have :other_key key"
      %{a: [_ | _], c: "foobar"} -> "Won't match because :a key doesn't have a list value"
      %{c: "foo" <> rest_of_string} -> "Matches. rest_of_string is #{rest_of_string} == bar"
    end
    

    Pattern matching
    Case expression

    Multiple function clauses

    In Elixir, much more logic can live in the function head than in a lot of other languages—you can pattern match on the arguments there rather than inside the function body. This allows you to split out different branches of logic of a function into completely separate clauses, changing the mental model of function definitions to be closer to simple mapping operations. "Function f, applied to a value that looks like X, produces X'. Function f, applied to a value that looks like Y, produces Y'. ..."

    Pattern matching is also generally a constant-time operation due to restrictions on the assertions you can make, and when it's done in the function head it gets optimized even more during compilation. This means you can gain a pretty significant performance boost by moving a lot of logic into the function head.

    defmodule MyList do
      def length([]), do: 0
      def length([_element | rest]), do: 1 + length(rest)
    end
    

    Multiple function clauses

    Macros; generating function clauses

    The final piece of the puzzle here is Elixir's metaprogramming facilities. I've only dipped my toes in macros since they come with a lot of caveats, but one relatively safe use is generating a lot of similar function clauses from a list of "seed" values at compile time. This essentially boils down to programmatically generating the patterns instead of writing them all out manually. It also often comes with a big performance boost since the work is done at compile time. You can see an extreme example of this in Elixir's Unicode tokenizer—at compile time, it parses all of the codepoints from a text file and creates a huge number of function clauses.

    defmodule Greeter do
      # Not the best real-world example, but hopefully it gets the point across
      for spanish_name <- ["María", "Juan", "Sofía", "Carlos"] do
        def greet(unquote(spanish_name)), do: "Hola, #{unquote(spanish_name)}"
      end
    
      for english_name <- ["Mary", "John", "Sophia", "Carl"] do
        def greet(unquote(english_name)), do: "Hello, #{unquote(english_name)}"
      end
    end
    
    iex> Greeter.greet("Carlos")
    "Hola, Carlos"
    
    iex> Greeter.greet("Mary")
    "Hello, Mary"
    

    Macros

    Other syntax tidbits

    |>: pipe operator. div(abs(n), 2) == n |> abs() |> div(2)

    &: function capture operator. Captures a named function to be bound to a variable or otherwise passed around. Syntax is &function_name/arity; alternatively it can be used for quickly creating anonymous functions or partial application/closures by using &1, &2 etc as positional arguments: increment = &Kernel.+(&1, 1)

    ?: codepoint operator. Gets the unicode codepoint of a single character. ?a == 97, ?0 == 48

    '': charlist. As opposed to a string "foo", a charlist 'foo' is literally a list of codepoints. 'foo' == [?f, ?o, ?o] == [102, 111, 111]. Great for when you need to work through a string one character at a time.

    _: "any" pattern match. Lets you assert the structure of a piece of data without caring about its value. {:ok, _} = {:ok, 5}

    | (in a list): head/tail separator. Elixir lists are singly-linked lists and the syntax for getting the head and the tail is [head | tail]. [1, 2, 3] == [1 | [2 | [3 | []]]]

    7 votes
  18. Comment on Day 9: Smoke Basin in ~comp.advent_of_code

    jzimbel
    (edited)

    Elixir

    I'm using a project that I started last year, so I took advantage of a CharGrid module that I previously defined to parse and work with a grid of characters. It still took a while because 1) I was too stubborn to look up and implement a flood-fill algorithm, and 2) iteratively building a collection while simultaneously changing the collection you're pulling values from does not lend itself to a FP/immutable data structures approach. I had to write two custom recursive functions to accomplish this.

    The first time I ran it, partial basins were getting merged together when they shouldn't, and it took me a while to realize I was doing a set union where I should have been doing an intersection. It solved the puzzle successfully after that.

    Part 2 runs in about 1.3s. I wonder how much faster it would be if I'd used a real flood-fill algorithm...

    Both parts
    defmodule AdventOfCode.Solution.Year2021.Day09 do
      alias AdventOfCode.CharGrid
    
      def part1(input) do
        input
        |> CharGrid.from_input()
        |> local_minima_risk_levels()
        |> Enum.sum()
      end
    
      def part2(input) do
        input
        |> CharGrid.from_input()
        |> get_basins()
        |> Enum.map(&MapSet.size/1)
        |> Enum.sort(:desc)
        |> Enum.take(3)
        |> Enum.product()
      end
    
      defp local_minima_risk_levels(grid) do
        grid
        |> CharGrid.filter_cells(fn {coords, value} ->
          grid
          |> CharGrid.adjacent_values(coords)
          |> Enum.all?(&(value < &1))
        end)
        # codepoint for '0' (aka ?0 in Elixir syntax) == 48;
        # The problem calls for value + 1, so subtract 47 from codepoint
        |> Enum.map(fn {_coords, char} -> char - 47 end)
      end
    
      defp get_basins(grid) do
        grid
        |> CharGrid.to_list()
        |> non_nine_coords()
        |> Enum.map(fn coords ->
          adjacent_non_nines =
            grid
            |> CharGrid.adjacent_cells(coords, :cardinal)
            |> non_nine_coords()
    
          MapSet.new([coords | adjacent_non_nines])
        end)
        |> merge_basins()
      end
    
      defp non_nine_coords(cells) do
        Enum.flat_map(cells, fn
          {_coords, ?9} -> []
          {coords, _value} -> [coords]
        end)
      end
    
      # This is inefficient but I didn't feel like learning
      # how to implement a proper flood-fill algorithm
      defp merge_basins(seed_basins, merged_basins \\ [])
    
      defp merge_basins([], merged_basins), do: merged_basins
    
      defp merge_basins([seed_basin | seed_basins], merged_basins) do
        {remaining, merged_basin} = flood_basin(seed_basin, seed_basins)
    
        merge_basins(remaining, [merged_basin | merged_basins])
      end
    
      @doc """
      This function repeatedly goes through the list of seed basins, merging
      connected ones to the target basin until no more can be merged.
    
      It returns the fully filled basin (a set of coordinate pairs)
      as well as a list of the remaining seed basins that have yet
      to be connected to anything else.
      """
      defp flood_basin(basin, seed_basins) do
        import MapSet, only: [intersection: 2, size: 1, union: 2]
    
        {remaining_seed_basins, new_basin} =
          Enum.flat_map_reduce(seed_basins, basin, fn seed_basin, acc ->
            if size(intersection(acc, seed_basin)) > 0 do
              {[], union(acc, seed_basin)}
            else
              {[seed_basin], acc}
            end
          end)
    
        if size(basin) == size(new_basin) do
          {remaining_seed_basins, new_basin}
        else
          flood_basin(new_basin, remaining_seed_basins)
        end
      end
    end
    
    CharGrid module

    This is me LARPing as a core library contributor 🤓

    defmodule AdventOfCode.CharGrid do
      @moduledoc "Data structure representing a finite grid of characters, backed by a map of {x, y} => char"
    
      alias __MODULE__, as: T
    
      @type t :: %T{
              grid: grid,
              width: non_neg_integer,
              height: non_neg_integer
            }
    
      @typep grid :: %{coordinates => char}
    
      @type coordinates :: {non_neg_integer, non_neg_integer}
    
      @type cell :: {coordinates, char}
    
      @enforce_keys ~w[grid width height]a
      defstruct @enforce_keys
    
      # List of coords that produce the 8 coordinates surrounding a given coord when added to it
      @all_adjacent_deltas for x <- -1..1, y <- -1..1, not (x == 0 and y == 0), do: {x, y}
    
      # List of coords that produce the 4 coordinates horizontally/vertically adjacent to a given coord when added to it
      @cardinal_adjacent_deltas for x <- -1..1, y <- -1..1, abs(x) + abs(y) == 1, do: {x, y}
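      # (evaluates to [{-1, 0}, {0, -1}, {0, 1}, {1, 0}])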
    
      # List of coords that produce the 4 coordinates diagonally adjacent to a given coord when added to it
      @intercardinal_adjacent_deltas for x <- -1..1, y <- -1..1, abs(x) + abs(y) == 2, do: {x, y}
    
      @adjacency_deltas_by_type %{
        all: @all_adjacent_deltas,
        cardinal: @cardinal_adjacent_deltas,
        intercardinal: @intercardinal_adjacent_deltas
      }
    
      @spec from_input(String.t()) :: t()
      def from_input(input) do
        charlists =
          input
          |> String.split()
          |> Enum.map(&String.to_charlist/1)
    
        height = length(charlists)
        width = length(hd(charlists))
    
        grid =
          for {line, y} <- Enum.with_index(charlists),
              {char, x} <- Enum.with_index(line),
              into: %{} do
            {{x, y}, char}
          end
    
        %T{
          grid: grid,
          width: width,
          height: height
        }
      end
    
      @doc "Gets the value at the given coordinates."
      @spec at(t(), coordinates) :: char | nil
      def at(%T{} = t, coords) do
        t.grid[coords]
      end
    
      @doc "Applies `fun` to each cell to produce a new CharGrid. `fun` must return a char."
      @spec map(t(), (cell -> char)) :: t()
      def map(%T{} = t, fun) do
        %{t | grid: for({coords, _} = cell <- t.grid, into: %{}, do: {coords, fun.(cell)})}
      end
    
      @doc "Converts the grid to a list of cells."
      @spec to_list(t()) :: list(cell)
      def to_list(%T{} = t) do
        Map.to_list(t.grid)
      end
    
      @doc "Returns the number of cells for which `predicate` returns a truthy value."
      @spec count(t(), (cell -> as_boolean(any()))) :: non_neg_integer()
      def count(%T{} = t, predicate) do
        Enum.count(t.grid, predicate)
      end
    
      @doc "Returns the number of cells containing `char`."
      @spec count_chars(t(), char) :: non_neg_integer()
      def count_chars(%T{} = t, char) do
        count(t, fn {_, c} -> c == char end)
      end
    
      @doc "Returns a list of cells for which `predicate` returns a truthy value."
      @spec filter_cells(t(), (cell -> as_boolean(any()))) :: list(cell)
      def filter_cells(%T{} = t, predicate) do
        Enum.filter(t.grid, predicate)
      end
    
      @type adjacency_type :: :all | :cardinal | :intercardinal
    
      @doc """
      Returns a list of cells adjacent to the one at `coords`.
    
      The type of adjacency is determined by the third argument:
    
      - `:all` (default behavior):
        ```
        OOO
        O*O
        OOO
        ```
      - `:cardinal`:
        ```
        .O.
        O*O
        .O.
        ```
      - `:intercardinal`:
        ```
        O.O
        .*.
        O.O
        ```
      """
      @spec adjacent_cells(t(), coordinates, adjacency_type) :: list(cell)
      def adjacent_cells(%T{} = t, coords, adjacency_type \\ :all) do
        @adjacency_deltas_by_type[adjacency_type]
        |> Enum.map(&sum_coordinates(coords, &1))
        |> Enum.map(fn adjacent_coords -> {adjacent_coords, at(t, adjacent_coords)} end)
        |> Enum.reject(fn {_coords, value} -> is_nil(value) end)
      end
    
      @doc """
      Convenience function that behaves the same as `adjacent_cells/3`,
      but returns only the char value of each adjacent cell.
      """
      @spec adjacent_values(t(), coordinates, adjacency_type) :: list(char)
      def adjacent_values(%T{} = t, coords, adjacency_type \\ :all) do
        t
        |> adjacent_cells(coords, adjacency_type)
        |> Enum.map(&elem(&1, 1))
      end
    
      @doc """
      Returns a list of values from the up to 8 cells reachable by a chess queen's move from the
      cell at `coords`.
    
      The optional `empty_char` (default `?.`) dictates which cells are considered unoccupied.
      """
      @spec queen_move_values(t(), coordinates, char) :: list(char)
      def queen_move_values(%T{} = t, coords, empty_char \\ ?.) do
        @all_adjacent_deltas
        |> Enum.map(&find_nonempty_on_line(t, &1, sum_coordinates(coords, &1), empty_char))
        |> Enum.reject(&is_nil/1)
      end
    
      defp find_nonempty_on_line(t, step, coords, empty_char) do
        case at(t, coords) do
          ^empty_char -> find_nonempty_on_line(t, step, sum_coordinates(coords, step), empty_char)
          val -> val
        end
      end
    
      defp sum_coordinates({x1, y1}, {x2, y2}), do: {x1 + x2, y1 + y2}
    end
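
    A quick usage sketch (illustrative input; the expected values follow from the functions above):

    alias AdventOfCode.CharGrid

    grid =
      CharGrid.from_input("""
      123
      456
      789
      """)

    CharGrid.at(grid, {1, 1})
    #=> ?5 (53)

    CharGrid.adjacent_values(grid, {0, 0}, :cardinal)
    #=> [?4, ?2]

    CharGrid.count_chars(grid, ?9)
    #=> 1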
    
    2 votes