hpr's recent activity

  1. Comment on Day 10: Factory in ~comp.advent_of_code

    hpr

    I'm in the same boat.

    My code for part 2 also probably (tm) technically works, but I'm not gonna run it.
    It takes about 2.5 minutes for the demo input, which spells disaster for my actual input: it not only has ~60 times the machines, but those machines are also more complex. So the actual processing time would probably be far more than 60 times that of the demo data, since the bulk of the work is in trying all button combinations of size n, especially once n crosses 10.
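    Back-of-the-envelope (assuming, say, 40 buttons per machine — a made-up number), the combination counts explode quickly:

```elixir
# Rough intuition for the blow-up: the number of size-n button
# combinations, C(buttons, n), grows combinatorially with n.
defmodule Comb do
  def choose(n, k) when k in 0..n do
    div(factorial(n), factorial(k) * factorial(n - k))
  end

  defp factorial(0), do: 1
  defp factorial(n), do: n * factorial(n - 1)
end

IO.inspect(Comb.choose(40, 5))   # => 658008
IO.inspect(Comb.choose(40, 10))  # => 847660528
```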

    Part 1 also took quite a while and actually filled my RAM before I switched from lists to lazy streams.
    I managed to get the first part down to ~15 seconds, which is utterly glacial compared to what Berdes is putting out, but was a big improvement.

    Most of the optimization potential is still on the table. After being a bit sour on part 2 yesterday, I took it lightly and just tried to do it any way I could first. But on the upside, I got some practice with lazy streams and also with Elixir's very simple concurrent computation, which only took a few changed lines.
    It's fun to see the PC actually quickly use 100% of all cores here, but it's clearly not the way to solve this.
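    For reference, the concurrency change is basically swapping an `Enum.map` for `Task.async_stream` (a toy sketch of the shape, not my actual solution code):

```elixir
# Toy sketch of the Enum -> Task.async_stream swap: each "machine"
# (here just a number) is processed in its own task.
machines = 1..8

serial =
  machines
  |> Enum.map(fn m -> m * m end)
  |> Enum.sum()

parallel =
  machines
  # ordered: false lets fast tasks finish first; order doesn't matter for a sum
  |> Task.async_stream(fn m -> m * m end, ordered: false)
  |> Enum.map(fn {:ok, result} -> result end)
  |> Enum.sum()

IO.puts(serial == parallel)
```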

    I think I'll just aim to do part 1 from here on out and try to return with a little time during the holidays to finish the remaining part 2s without destroying my sleep schedule during the work week.

    2 votes
  2. Comment on Day 10: Factory in ~comp.advent_of_code

    hpr

    I continue to be in awe of your contributions.
    Thanks for taking the time to do these little writeups!

    1 vote
  3. Comment on Day 8: Playground in ~comp.advent_of_code

    hpr
    (edited)

    Elixir

    This one went much smoother for me, but still took me about 2 hours.
    I'd be curious how much time y'all are spending on these solutions, because I do find it's starting to be a bit of a drain now. But then again, it's time well spent learning a new language.

    I had two silly mistakes:

    • At the start, I generated both the distance from junction A to junction B and from junction B to junction A. These would be the same, so after sorting I would process them one after the other, effectively processing only half the required elements.
    • Later, when a connection merged two existing circuits, I duplicated the elements of the connection, causing the "size" calculation to be off.

    Part two was super quick, not much to change from part one.

    Both parts
      ### PARSING ###
    
      def parse() do
        parse(AocElixir.read_lines(8))
      end
    
      def line_to_point(line) do
        [x, y, z] = String.split(line, ",") |> Enum.map(&String.to_integer/1)
        {x, y, z}
      end
    
      def parse(lines) do
        lines |> Enum.map(&line_to_point/1)
      end
    
      ### PART ONE ###
      def part1(junctions, num_connections \\ 1000) do
        shortest_distances = shortest_distances(junctions)
    
        connections = Enum.take(shortest_distances, num_connections)
    
        circuits = make_circuits(connections)
    
        circuits
        |> Enum.map(&Enum.count/1)
        |> Enum.sort()
        |> Enum.take(-3)
        |> Enum.product()
      end
    
      # Find the first element matching `condition`; returns {element, rest}
      # (rest comes back in reverse order, which doesn't matter here).
      def find_remove(enum, condition, results \\ {nil, []})
    
      def find_remove([], _, {element, list}), do: {element, list}
    
      def find_remove([head | tail], condition, {element, list}) do
        if condition.(head) do
          find_remove(tail, condition, {head, list})
        else
          find_remove(tail, condition, {element, [head | list]})
        end
      end
    
      def connect_to_circuits(connection, circuits) do
        {left, right} = connection
    
        {existing_circuit_left, circuits} = find_remove(circuits, &Enum.member?(&1, left))
        {existing_circuit_right, circuits} = find_remove(circuits, &Enum.member?(&1, right))
    
        case {existing_circuit_left, existing_circuit_right} do
          {nil, nil} ->
            [[left, right] | circuits]
    
          {_, nil} ->
            [Enum.uniq([right | existing_circuit_left]) | circuits]
    
          {nil, _} ->
            [Enum.uniq([left | existing_circuit_right]) | circuits]
    
          {_, _} ->
            [Enum.uniq(existing_circuit_left ++ existing_circuit_right) | circuits]
        end
      end
    
      def make_circuits(connections) do
        Enum.reduce(connections, [], &connect_to_circuits/2)
      end
    
      def distance({point_a, point_b}), do: distance(point_a, point_b)
    
      def distance({x1, y1, z1}, {x2, y2, z2}) do
        ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2) ** 0.5
      end
    
      # All unordered pairs {a, b}, generating each pair exactly once.
      def pairs(anys, results \\ [])
      def pairs([], results), do: results
    
      def pairs([head | tail], results) do
        ps = for el <- tail, do: {head, el}
        pairs(tail, ps ++ results)
      end
    
      def shortest_distances(points) do
        points |> pairs() |> Enum.sort_by(&distance/1)
      end
    
      ### PART 2 ###
      def part2(junctions) do
        shortest_distances = shortest_distances(junctions)
        num_junctions = Enum.count(junctions)
    
        last_connection =
          Enum.reduce_while(shortest_distances, [], fn connection, circuits ->
            new_circuits = connect_to_circuits(connection, circuits)
    
            if Enum.empty?(tl(new_circuits)) and Enum.count(hd(new_circuits)) === num_junctions do
              {:halt, connection}
            else
              {:cont, new_circuits}
            end
          end)
    
        {{x1, _y1, _z1}, {x2, _y2, _z2}} = last_connection
        x1 * x2
      end
    
    Benchmarks

    It's... not very quick. But adequate.
    The kinds of things some of y'all are cooking up, to my amazement, just demonstrate
    how little time I've spent on classical algorithms and data structures in my day job.

    Name               ips        average  deviation         median         99th %
    parse          2715.45      0.00037 s    ±13.12%      0.00035 s      0.00060 s
    part one          3.06         0.33 s    ±12.98%         0.32 s         0.41 s
    part two          0.80         1.25 s     ±2.45%         1.26 s         1.29 s
    
    1 vote
  4. Comment on Day 7: Laboratories in ~comp.advent_of_code

    hpr

    Elixir

    I played this one with the big handicap of refusing to treat the obvious grid as a grid. This made it much lengthier and more complex than it should've been, especially now that I look at y'all's solutions.
    My solution instead scans the input once, looking at two lines at a time. (which obviously would've been just as feasible with a grid too!)

    Then, the chars above each other are compared to then generate a potential output.
    So (top: |, bottom: ^) would generate (left: |, middle: ^, right: |).
    These chunks are then recombined to create the new output line. (By priority: if any potential chunk has a ^, take that, then check for |)

    For the second part, I do the same but also create all permutations of the new line value.
    To make the program go fast enough, I then cull all timelines where the last line is the same and keep a running total of how many source timelines exist for that state, so the count stays accurate. This was clearly necessary to not end up in exponential hell; memoization would've been an alternative solution.
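    The culling step boils down to grouping identical states and summing their source counts, roughly like this (toy atoms stand in for the real last-line data):

```elixir
# Collapse timelines whose latest state is identical, keeping a running
# total of how many source timelines led to each state.
timelines = [{:state_a, 2}, {:state_b, 1}, {:state_a, 3}]

culled =
  timelines
  |> Enum.group_by(fn {state, _count} -> state end, fn {_state, count} -> count end)
  |> Enum.map(fn {state, counts} -> {state, Enum.sum(counts)} end)
  |> Enum.sort()

IO.inspect(culled)  # => [state_a: 5, state_b: 1]
```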

    I'm not too happy with the runtime, but it's fine after that optimization.

    I'm getting tons of practice thinking in lists, recursion and such, but I don't really enjoy having to track little numeric counters and such through tuple types instead of mutating a sum somewhere.

    Overengineered Code
      ### PARSING ###
    
      def parse() do
        parse(AocElixir.read_lines(7))
      end
    
      def parse(lines) do
        Enum.map(lines, &String.to_charlist/1)
      end
    
      ### PART ONE ###
    
      def sliding_window_chunks(list, num_window_size)
          when is_list(list) and num_window_size > length(list) do
        []
      end
    
      def sliding_window_chunks(list, num_window_size)
          when is_list(list) and num_window_size <= length(list) do
        {initial, rest} = Enum.split(list, num_window_size)
    
        Enum.chunk_while(
          rest,
          initial,
          fn elem, acc ->
            new_acc = tl(acc) ++ [elem]
            emitted_chunk = acc
            {:cont, emitted_chunk, new_acc}
          end,
          # emit the remaining accumulator
          fn acc -> {:cont, acc, []} end
        )
      end
    
      defp process_chunks(chunks, join_fun) do
        {heads, tails} = chunks |> Enum.map(&Enum.split(&1, 1)) |> Enum.unzip()
        heads = heads |> Enum.reject(&Enum.empty?/1)
        tails = tails |> Enum.reject(&Enum.empty?/1)
    
        head = heads |> Enum.map(&hd/1) |> join_fun.()
    
        {head, tails}
      end
    
      defp join_chunks(remaining_chunks, current_chunks, join_fun, results) do
        case {remaining_chunks, current_chunks} do
          {[], []} ->
            results
    
          {[], current_chunks} ->
            {head, tails} = process_chunks(current_chunks, join_fun)
            join_chunks([], tails, join_fun, [head | results])
    
          {[new_chunk | next_remaining_chunks], current_chunks} ->
            current_chunks = [new_chunk | current_chunks]
            {head, tails} = process_chunks(current_chunks, join_fun)
            join_chunks(next_remaining_chunks, tails, join_fun, [head | results])
        end
      end
    
      def join_chunks(chunks, join_fun), do: join_chunks(chunks, [], join_fun, [])
    
      # vals: {upper_char, lower_char}
      def advance_beam(vals) do
        case vals do
          {?S, ?.} -> [?., ?|, ?.]
          {?|, ?^} -> [?|, ?^, ?|]
          {?|, ?.} -> [?., ?|, ?.]
          {?., ?^} -> [?., ?^, ?.]
          {_, ?.} -> [?., ?., ?.]
        end
      end
    
      def choose_char_from_options(list_of_chars) do
        cond do
          Enum.member?(list_of_chars, ?^) -> ?^
          Enum.member?(list_of_chars, ?|) -> ?|
          Enum.member?(list_of_chars, ?.) -> ?.
        end
      end
    
      def simulate_beam(input_lines) do
        input_lines
        |> Enum.reduce([], fn line, result ->
          if Enum.empty?(result) do
            [line | result]
          else
            prev_line = hd(result)
    
            new_line =
              Enum.zip(prev_line, line)
              |> Enum.map(&advance_beam/1)
              |> join_chunks(&choose_char_from_options/1)
              # because we're generating 3-tuples for each char, we need to remove the first and last generated, which are not collapsed by join_chunks
              |> Enum.drop(1)
              # we're building in reverse because of linked lists, so reverse
              |> Enum.reverse()
    
            [tl(new_line) | result]
          end
        end)
        # we're building in reverse because of linked lists, so reverse
        |> Enum.reverse()
      end
    
      def count_splits(input_lines) do
        input_lines
        # columnwise
        |> Enum.zip()
        |> Enum.map(fn col ->
          sliding_window_chunks(Tuple.to_list(col), 2)
          |> Enum.count(fn [upper, lower] -> upper === ?| and lower === ?^ end)
        end)
        |> Enum.sum()
      end
    
      def part1(input_lines) do
        input_lines
        |> simulate_beam()
        |> count_splits()
      end
    
      ### PART 2 ###
    
      # vals: {upper_char, lower_char}
      def advance_quantum_beam(vals) do
        case vals do
          {?S, ?.} -> [[?., ?|, ?.]]
          {?|, ?^} -> [[?|, ?^, ?.], [?., ?^, ?|]]
          {?|, ?.} -> [[?., ?|, ?.]]
          {?., ?^} -> [[?., ?^, ?.]]
          {_, ?.} -> [[?., ?., ?.]]
        end
      end
    
      def permutations(list_of_lists) do
        Enum.reduce(Enum.reverse(list_of_lists), [[]], fn options, permutations ->
          Enum.flat_map(options, fn option ->
            Enum.map(permutations, fn permutation ->
              [option | permutation]
            end)
          end)
        end)
      end
    
      def simulate_quantum_beam(input_lines) do
        input_lines
        |> Enum.reduce([], fn line, results ->
          if Enum.empty?(results) do
            [{[line], 1}]
          else
            new_results =
              Enum.flat_map(results, fn {result_lines, sources} ->
                prev_line = hd(result_lines)
    
                new_quantum_lines =
                  Enum.zip(prev_line, line)
                  |> Enum.map(&advance_quantum_beam/1)
                  |> permutations()
                  |> Enum.map(fn permutation ->
                    permutation
                    |> join_chunks(&choose_char_from_options/1)
                    # because we're generating 3-tuples for each char, we need to remove the first and last generated, which are not collapsed by join_chunks
                    |> Enum.drop(1)
                    # we're building in reverse because of linked lists, so reverse
                    |> Enum.reverse()
                    |> Enum.drop(1)
                  end)
    
                Enum.map(new_quantum_lines, fn new_line -> {[new_line | result_lines], sources} end)
              end)
    
            new_results
            |> Enum.uniq_by(fn {result_lines, _} -> hd(result_lines) end)
            |> Enum.map(fn {result_lines, _} ->
              {result_lines,
               Enum.filter(new_results, fn res -> hd(elem(res, 0)) === hd(result_lines) end)
               |> Enum.sum_by(&elem(&1, 1))}
            end)
          end
        end)
        # we're building in reverse because of linked lists, so reverse
        |> Enum.map(fn {result_lines, sources} -> {Enum.reverse(result_lines), sources} end)
      end
    
      def part2(input) do
        input
        |> simulate_quantum_beam()
        |> Enum.sum_by(fn {_, sources} -> sources end)
      end
    
    Benchmarks
    Name               ips        average  deviation         median         99th %
    parse         14168.17      0.0706 ms    ±33.69%      0.0844 ms       0.116 ms
    part one        174.39        5.73 ms     ±7.58%        5.58 ms        7.01 ms
    part two          3.09      324.02 ms     ±1.01%      323.01 ms      332.19 ms
    
    1 vote
  5. Comment on Day 6: Trash Compactor in ~comp.advent_of_code

    hpr

    Elixir

    Both Parts
      def parse() do
        parse(AocElixir.read_lines(6))
      end
    
      def parse_op(op) when op === "+", do: &Kernel.+/2
      def parse_op(op) when op === "*", do: &Kernel.*/2
    
      def parse_ops(raw_ops_row), do: raw_ops_row |> String.split() |> Enum.map(&parse_op/1)
    
      def parse(lines) do
        row_num = Enum.count(lines)
        {raw_num_rows, [raw_ops_row]} = Enum.split(lines, row_num - 1)
        {raw_num_rows, parse_ops(raw_ops_row)}
      end
    
      ### PART 1 ###
    
      def parse_str_num(str_num), do: str_num |> String.trim() |> String.to_integer()
    
      def combine(row1, row2, fns) do
        Enum.zip_with([row1, row2, fns], fn [a, b, op] -> op.(a, b) end)
      end
    
      def part1({raw_num_rows, ops}) do
        num_rows =
          Enum.map(
            raw_num_rows,
            fn row -> String.split(row) |> Enum.map(&parse_str_num/1) end
          )
    
        num_rows
        # associate each column with its operation
        |> Enum.reduce(&combine(&1, &2, ops))
        |> Enum.sum()
      end
    
      ### PART 2 ###
    
      def split_on_condition(enum, condition) do
        enum
        |> Enum.chunk_while(
          [],
          fn char, acc ->
            if condition.(char) do
              # emit a chunk with all previous values once condition is hit
              # reverse because cons would otherwise reverse the input
              {:cont, Enum.reverse(acc), []}
            else
              # accumulate non-matching values
              {:cont, [char | acc]}
            end
          end,
          # emit the remaining accumulator
          fn acc -> {:cont, Enum.reverse(acc), []} end
        )
      end
    
      def part2({raw_num_rows, ops}) do
        column_charlists =
          raw_num_rows
          |> Enum.map(&String.to_charlist/1)
          |> Enum.zip()
          |> Enum.map(&Tuple.to_list(&1))
    
        grouped_num_columns =
          column_charlists
          |> split_on_condition(&Enum.all?(&1, fn char -> char === ?\s end))
          |> Enum.map(
            &Enum.map(&1, fn col ->
              col
              |> to_string()
              |> String.trim()
              |> String.to_integer()
            end)
          )
    
        Enum.zip(ops, grouped_num_columns)
        |> Enum.map(fn {op, cols} -> Enum.reduce(cols, op) end)
        |> Enum.sum()
      end
    
    Benchmarks
    Name               ips        average  deviation         median         99th %
    parse          17.70 K       56.49 μs     ±8.84%       55.51 μs       72.43 μs
    part one        1.44 K      696.27 μs    ±12.45%      668.96 μs     1012.20 μs
    part two        0.62 K     1609.27 μs    ±11.51%     1605.36 μs     2125.26 μs
    

    Notes:

    Blegh, this was an odyssey for all the wrong reasons for me.
    I breezed through part one and even though I basically had to rewrite for part two, I quickly had a solution that satisfied the demo.

    BUT IT WOULD NOT PASS.

    Why, why, I lamented. Well it turned out after painstaking searches and debugging, that my program's core logic had no fault! It worked! However, there was a little detail:

    Enum.zip(), which I used to transform the rows of numbers into columns, stops once any of the enumerables is empty. Which should not be a problem, right? The lines all have the same length! The text file was even properly formatted after all, all nice and tidy!

    Except, in my infinite wisdom, I had given my file-reading-helper a little convenience: It trimmed the input string. Which deleted the trailing newline. Which made the last line shorter by one char. Which meant the last column wasn't being processed.
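    To illustrate the trap:

```elixir
# Enum.zip/1 stops as soon as any enumerable is exhausted, so one row
# that is a single element short silently drops the whole last column.
rows = [[1, 2, 3], [4, 5, 6], [7, 8]]

IO.inspect(Enum.zip(rows))
# => [{1, 4, 7}, {2, 5, 8}] -- the {3, 6, _} column is gone, no error raised
```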

    Blegh.

    2 votes
  6. Comment on Day 5: Cafeteria in ~comp.advent_of_code

    hpr

    I wonder why they don't just add a separate clause with the faster logic for the most commonly used step size of 1.

    Strange indeed, the more you know!

    Re: Re: ... types ...: Yeah I think for puzzles like this, that's actually pretty fine.
    Indeed, it's probably much more convenient than the usual OOP type-handling. I'm really enjoying the multiple function definitions with pattern matching as a replacement for if-statements, tbh. (Although as a beginner I've already run into some weirdness around the restrictions on what can and can't be used as a guard.)
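    For anyone curious, the pattern (and the guard restriction) in a toy example:

```elixir
# Multi-clause function heads replacing an if/else chain. Note that only
# a small whitelist of expressions is allowed after `when`: is_integer/1
# and comparisons are fine, but an arbitrary function like Enum.member?/2
# won't compile as a guard.
defmodule Classify do
  def describe(0), do: "zero"
  def describe(n) when is_integer(n) and n > 0, do: "positive"
  def describe(n) when is_integer(n), do: "negative"
end

IO.puts(Classify.describe(-2))  # => negative
```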

    I'm wondering how this would pan out in large applications, where (in my experience) it's very handy to restrict what kinds of operations you can do on certain data to prevent "illegal" state. In the same vein, I do wonder how I'd deal with the dynamic typing in that kind of environment once you tackle a larger refactoring. (You see, I've only used statically typed languages professionally.)

    When I became interested in a BEAM-language, I first looked at Gleam as a static language, but Elixir simply seemed more mature.

    1 vote
  7. Comment on Day 5: Cafeteria in ~comp.advent_of_code

    hpr

    Very similar. Even though I banged my head against a silly error in the range consolidation, I'm a little proud I have you very slightly beat on this one!

    Code
      ### PARSING ###
    
      def parse, do: parse(AocElixir.read_input(5))
    
      def parse(input) do
        [fresh_lines, test_lines] =
          input
          |> String.split("\n\n")
          |> Enum.map(&String.split(&1, "\n"))
    
        fresh_ranges = fresh_lines |> parse_fresh() |> compact_ranges()
        test_ids = test_lines |> Enum.map(&String.to_integer/1)
    
        {fresh_ranges, test_ids}
      end
    
      defp contains?({from, to}, num), do: from <= num and num <= to
    
      defp map_ranges(range, :default), do: {[], range}
    
      defp map_ranges({new_from, new_to}, {current_from, current_to}) do
        if contains?({current_from, current_to}, new_from) do
          # don't add the element just yet, instead merge it into accumulator, we're gathering a bigger range
          {[], {current_from, max(new_to, current_to)}}
        else
          # stop gathering the current range and add it into the list, start gathering the new range as the accumulator to check merges for next elements
          {[{current_from, current_to}], {new_from, new_to}}
        end
      end
    
      defp merge_last_accumulator({results, last_acc}), do: results ++ [last_acc]
    
      def compact_ranges(ranges) do
        ranges
        |> Enum.group_by(fn {from, _to} -> from end, fn {_from, to} -> to end)
        |> Enum.map(fn {key, values} -> {key, Enum.max(values)} end)
        |> Enum.sort_by(fn {from, _to} -> from end)
        |> Enum.flat_map_reduce(:default, &map_ranges/2)
        |> merge_last_accumulator()
      end
    
      def parse_fresh(range_lines) do
        range_lines
        |> Enum.map(&String.split(&1, "-"))
        |> Enum.map(fn [from, to] -> {String.to_integer(from), String.to_integer(to)} end)
      end
    
      ### PART 1 ###
    
      def fresh?(fresh_ranges, num), do: Enum.any?(fresh_ranges, &contains?(&1, num))
    
      def part1({fresh_ranges, test_nums}), do: Enum.count(test_nums, &fresh?(fresh_ranges, &1))
    
      ### PART 2 ###
    
      defp num_ids({from, to}), do: to - from + 1
    
      def part2({fresh_ranges, _test_nums}), do: fresh_ranges |> Enum.sum_by(&num_ids(&1))
    
    Benchmarks
    Name               ips        average  deviation         median         99th %
    part two     1533.17 K        0.65 μs    ±70.29%        0.64 μs        0.76 μs
    parse           5.67 K      176.24 μs    ±11.47%      172.19 μs      236.19 μs
    part one        2.87 K      348.02 μs     ±3.81%      344.16 μs      386.45 μs
    

    Also, is it all just tuples, arrays, maps and such usually in Elixir?
    Coming from OOP, I feel a little... type-starved :D
    But it would've been a good idea to split out the Range-stuff as you did.

    2 votes
  8. Comment on Day 5: Cafeteria in ~comp.advent_of_code

    hpr

    Another similar experience here.
    Also tried with a set, but I killed the process before anything bad could happen.

    Then I tried to be "clever" and thought it might somehow take less memory if I packed it all into a bit-mask, where 1 indicated presence and 0 non-presence. Turns out the Erlang VM does not like arbitrarily large bit-shifts.

    ...and I also struggled with a very basic error because of a case that wasn't handled in the demonstration data, but was in my input.

    2 votes
  9. Comment on Day 4: Printing Department in ~comp.advent_of_code

    hpr

    Huh, I was scratching my head a bit over whether there isn't a better / more efficient way to handle arrays in Elixir, but also just opted for a map of coordinates. Minus some naming, fancy language features and your grid module, we landed on pretty much the same solution, with very similar performance characteristics too.

    Name               ips        average  deviation         median         99th %
    part one         96.13       10.40 ms     ±5.26%       10.04 ms       11.51 ms
    part two          7.02      142.51 ms     ±2.42%      140.66 ms      147.03 ms
    

    (each part parses once, so that's included here)

    2 votes
  10. Comment on Day 2: Gift Shop in ~comp.advent_of_code

    hpr

    When scanning your solution, there's definitely some standard-library stuff I hadn't stumbled upon yet that I maybe should be using.
    I was a little unhappy with the runtime, but async_stream was surprisingly approachable and cut it down to ~2 seconds for both tasks (which still isn't exactly fast).

    I reckon doing the splitting on integers instead of strings might be the next obvious thing to improve, but I can't be bothered to do more now, time to sleep.
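    (For later: a sketch of what that integer version might look like — checking for a repeated digit block with powers of ten instead of string slicing. `repeated?/1` is a hypothetical helper, not part of my solution, and untested against the puzzle input.)

```elixir
defmodule NumericRepeat do
  # Hypothetical integer-based check: does `id` consist of one digit block
  # repeated two or more times? Instead of comparing substrings, rebuild
  # the candidate as block * a "spaced repunit" multiplier with k ones
  # spaced `len` digits apart (e.g. 10101 for k = 3, len = 2) and compare.
  def repeated?(id) when is_integer(id) and id > 0 do
    n = length(Integer.digits(id))

    Enum.any?(1..div(n, 2)//1, fn len ->
      rem(n, len) == 0 and matches_block_length?(id, n, len)
    end)
  end

  defp matches_block_length?(id, n, len) do
    base = Integer.pow(10, len)
    block = rem(id, base)
    k = div(n, len)
    multiplier = Enum.reduce(1..k, 0, fn _, acc -> acc * base + 1 end)

    block * multiplier == id
  end
end

IO.puts(NumericRepeat.repeated?(1212))  # => true
IO.puts(NumericRepeat.repeated?(1234))  # => false
```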

    Code
    defmodule DayTwo do
      @moduledoc """
      Day Two of AOC
      """
      require Integer
    
      def is_invalid_id_part_one(id) when is_number(id) do
        is_invalid_id_part_one(Integer.to_string(id))
      end
    
      def is_invalid_id_part_one(id) when is_binary(id) do
        length = String.length(id)
    
        if rem(length, 2) == 0 do
          is_invalid_id_with_repeat_length(id, 2)
        else
          false
        end
      end
    
    
      def is_invalid_id_part_two(id) when is_number(id) do
        is_invalid_id_part_two(Integer.to_string(id))
      end
    
      def is_invalid_id_part_two(id) when is_binary(id) do
        length = String.length(id)
    
        1..length
        |> Enum.map(fn len -> is_invalid_id_with_repeat_length(id, len) end)
        |> Enum.any?()
      end
    
      def is_invalid_id_with_repeat_length(id, len) do
        length = String.length(id)
    
        if rem(length, len) == 0 do
          [first_part | parts] = split_multiple(id, div(length, len))
          Enum.any?(parts) and Enum.all?(parts, &(first_part === &1))
        else
          false
        end
      end
    
      def split_multiple(str, len) do
        {left, right} = String.split_at(str, len)
    
        if String.length(right) < len do
          [left]
        else
          [left | split_multiple(right, len)]
        end
      end
    
      def parse_range(id_range) do
        [left, right] = String.split(id_range, "-")
        String.to_integer(left)..String.to_integer(right)
      end
    
      def get_invalid_ids(ids, is_invalid_id) do
        ids
        |> String.splitter([","], trim: true)
        |> Task.async_stream(fn range ->
          Enum.filter(parse_range(range), is_invalid_id)
        end)
        |> Enum.flat_map(fn result ->
          {_status, datalist} = result
          datalist
        end)
      end
    
      def sum_invalid_ids(ids, is_invalid_id) do
        ids
        |> get_invalid_ids(is_invalid_id)
        |> Enum.reduce(&(&1 + &2))
      end
    
      def solve_part_one do
        sum_invalid_ids(AocElixir.read_input(2), &is_invalid_id_part_one/1)
      end
    
      def solve_part_two do
        sum_invalid_ids(AocElixir.read_input(2), &is_invalid_id_part_two/1)
      end
    
      def main do
        IO.puts(~s"Part One: #{solve_part_one()}")
        IO.puts(~s"Part Two: #{solve_part_two()}")
      end
    end
    
    2 votes
  11. Comment on Day 1: Secret Entrance in ~comp.advent_of_code

    hpr

    The step up in difficulty really messed me up.

    I spent way more time than I usually would being certain there was just a very simple formula for counting the number of times one would pass over 0 without special cases. It's day 1 after all!

    But alas, handling the special cases for the negatives seems to have been necessary all along.
    I could have saved myself a lot of the time spent thinking I was missing something obvious.

    2 votes
  12. Comment on Day 1: Secret Entrance in ~comp.advent_of_code

    hpr

    Huh, didn't think I'd find someone else here doing it in Elixir.
    I might be having a closer look at your code, since I am using the puzzles to learn the language.

    2 votes
  13. Comment on Why humanity needs a Lunar seed vault in ~space

    hpr

    I am all for having a backup. I do wonder if, in a catastrophe, we would still have the resources to spare to retrieve the seeds from the moon, though.

    29 votes
  14. Comment on What games have you been playing, and what's your opinion on them? in ~games

    hpr

    Alright, that sucks. But as I said, I'm not surprised there was another bug there.

    War is certainly still interesting, though I am not a fan of early Naval warfare, since most countries simply don't seem to get sailors easily until Banking.
    I'm also still relatively confused about all the subject-stuff, and coordination with vassals is pretty bad.

    I wasn't that huge on modding the previous games except Ironman-compatible stuff, so I'm mostly miffed that a bunch of simple, graphical mods don't seem to be Ironman-compatible.
    Also, it's pretty easy to accidentally not enable Ironman, I wish there was a default setting there.

    1 vote
  15. Comment on What games have you been playing, and what's your opinion on them? in ~games

    hpr

    Great to hear! I have also been enjoying the game.

    I believe I read in some patch notes that the Claim Throne CB wasn't actually completely broken; it was just that the option to claim the throne would only show up after occupying the target's capital.
    (Though I'd not be surprised to be corrected that there were multiple bugs there.)

    Trade does seem to be in somewhat rough shape considering how important it is, and they seem to be aware of that. Amusingly, there was a reddit post showing that trade losses were incorrectly netted as profits, which means the "ideal way to trade" was to lose as much money as possible. (Though I haven't confirmed that in my game.)

    I think it's a bit tedious to get CBs in general as a European, especially in the HRE, but I also notice I don't feel as compelled to constantly be at war, because there is actual fun to be had outside of war.

    I also agree that this might be their best release in a long time, and it shows a ton of potential in general.

    1 vote
  16. Comment on Europa Universalis V review – even hardened grand strategy veterans may be startled by the intricacy of this historical simulation in ~games

    hpr
    Link Parent
    They actually did include three of the most vital DLCs in the base game last year: https://forum.paradoxplaza.com/forum/threads/free-dlcs-for-all-europa-universalis-iv-players.1701846/

    Otherwise, there is this resource on reddit you could check out: https://www.reddit.com/r/eu4/comments/13feee3/135_dlc_tier_list_which_dlc_to_buy/

    4 votes
  17. Comment on Europa Universalis V review – even hardened grand strategy veterans may be startled by the intricacy of this historical simulation in ~games

    hpr
    Link
    While I'm pretty intrigued (I'm also in the almost-3000-hours camp for EUIV) and have followed almost all the development diaries through the years, I'm disappointed they have explicitly not committed to any support for Linux or Mac, which recent games like CK3 and Victoria apparently did. I would have to hope for Proton to work here.

    With my other reservations about performance, the apparent lack of polish, weak AI competency and (this is very subjective, of course) an uninspiring UI style, I'm torn on whether to buy it or not. I guess I might wait for reviews from Linux users and for them to iron out some bugs in the first months after release.

    ... which sucks, because the foundation of the game seems very exciting. The scope is epic, and I love how they communicated very openly, treated their community as a resource, and took risks by openly developing this as a magnum opus of sorts. That should be rewarded! I really hope they hit the mark and build smartly on this foundation for a decade to come. If they do, I'm at serious risk of even buying into their DLC policy again.

    3 votes
  18. Comment on Making DND maps in ~games.tabletop

    hpr
    Link
    For another alternative, specifically for quick sketches without a lot of styling, I have used https://www.dungeonscrawl.com/, which is a mostly (for the basics) free web tool.

    2 votes
  19. Comment on Making DND maps in ~games.tabletop

    hpr
    Link Parent
    The UI of Wonderdraft is an acquired taste, but it's a great tool, imo.
    I'm not typically artistically inclined at all, but I'm pretty proud of some of my Wonderdraft maps.

    One of my players even printed one out and applied coffee grounds to the paper to give it a proper parchment style.

    3 votes
  20. Comment on What is a business/org that is great and ethical in so many aspects that everyone should consider using? in ~life

    hpr
    Link Parent
    Question: has anyone here bought their socks, and if so, can you offer any input on the durability? I walk largely barefoot at home, and also tend to walk/run hard, and have worn many holes in my socks. So I'm always on the lookout for good, somewhat thick socks that also aren't too thick.

    My girlfriend got AwesomeSocks throughout 2023, so the oldest pair of socks should be two years and eight months old, or something like that. I don't think she has had any holes yet, though I could be wrong. She certainly hasn't complained about low durability. I don't think she is as hard on her socks as you describe yourself to be.

    The only minor complaint here: when they're in the washer, they seem to attract an unusual amount of our cat's hair.