11 votes

Detailed astronomical observation logging with a microphone and AI

I'm into astronomy as a hobby, and it's useful to keep a log of the things you've seen in the sky. Many people take handwritten notes, but my handwriting is awful, and stopping to write takes away from enjoying the views through the telescope.

So, inspired by the AI note-taking applications I've read about online, I decided to try recording voice notes while out observing and letting software do the writing.

First, I run my recordings through Whisper's speech-to-text engine. Because the recordings are full of astronomical terms, the larger models (or the new turbo model) seem to work better: they're noticeably more accurate at identifying the names of moons and constellations.
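
Scripted, the transcription step might look something like this. This is just a sketch: it assumes the `openai-whisper` package (which provides the `whisper` command-line tool) is installed, and the file name is a placeholder.

```python
import subprocess
from pathlib import Path

def build_whisper_cmd(recording: Path, model: str = "turbo") -> list[str]:
    """Assemble the whisper CLI invocation for one recording."""
    return [
        "whisper", str(recording),
        "--model", model,            # larger models handle astronomy jargon better
        "--language", "en",
        "--output_format", "txt",    # plain-text transcript for the next step
    ]

def transcribe(recording: Path) -> None:
    """Run the transcription; writes a .txt transcript into the working directory."""
    subprocess.run(build_whisper_cmd(recording), check=True)
```

Usage would be something like `transcribe(Path("session.m4a"))`, leaving a `session.txt` to feed to the next step.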

Next, I run Ollama and give it a prompt like:

Here is a speech to text transcript of an observation session. Create a detailed log of the session, using only facts found in the text. Do not add your own interpretation to the text.
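
That step can be scripted too. Here's a minimal sketch, assuming the `ollama` CLI is installed and the model has already been pulled (the model tag is just an example):

```python
import subprocess

# The fixed instruction from above; the transcript is appended after it.
INSTRUCTION = (
    "Here is a speech to text transcript of an observation session. "
    "Create a detailed log of the session, using only facts found in the text. "
    "Do not add your own interpretation to the text."
)

def build_full_prompt(transcript: str) -> str:
    """Attach the transcript to the fixed instruction."""
    return f"{INSTRUCTION}\n\n{transcript}"

def summarize(transcript: str, model: str = "llama3.1:70b") -> str:
    """Pipe the combined prompt into `ollama run` and return the generated log."""
    result = subprocess.run(
        ["ollama", "run", model],
        input=build_full_prompt(transcript),
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout
```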

I'm still experimenting with prompts and models, but oddly, shorter prompts seem to give better results. So far, the model family doesn't seem to matter as much as the number of parameters: Gemma 2 27B outperforms Llama 3.1 8B, but Llama 3.1 70B outperforms both of them (again, I think the larger models have more context about which astronomical terms are important). Here's what that gets me with basically no work on my part:

**Observation Session Log**

**Session Details**

* Date: October 3rd, 2024
* Location: Wolf Street
* Telescope: Teleview 85 (also referred to as 80)
* Observer: [Not specified]

**Targets Observed**

1. M29 (Open Cluster in Cygnus)
	* Time: 2057 - 2101
	* Eyepiece used: 18.2mm
	* Notes:
		+ Found using star hopping from Sater
		+ Appears as a faint open cluster with 6-7 distinct stars
		+ Stars are fairly even in brightness and color
2. M39 (Open Cluster)
	* Time: 2109 - 2116
	* Eyepiece used: Ultra-wide angle, 18.2mm, and 8.5mm (noted as a mistake, actually using 7mm)
	* Notes:
		+ Found using star hopping from Deneb
		+ Appears as a larger open cluster with more differentiation in star magnitudes
		+ Stars appear bluer than surrounding stars
3. Alberio (Double Star)
	* Time: 2120, 2155
	* Eyepiece used: Ultra-wide angle and 7mm
	* Notes:
		+ Found easily without a guide
		+ Appears as a beautiful contrast between colors with constituent parts standing out from the surrounding stars
4. M15 (Globular Cluster in Pegasus)
	* Time: 2120 - 2130
	* Eyepiece used: Ultra-wide angle, 18.2mm, 7mm, and Barlow (2x) with 7mm
	* Notes:
		+ Found using a bright naked-eye star as a guide
		+ Appears as a slight smudge or fuzzy patch with averted vision
		+ Not resolving individual stars at any power
5. Saturn
	* Time: 2140 - 2150
	* Eyepiece used: Low power, highest power (with a star chart to confirm moon positions)
	* Notes:
		+ Three moons visible: Rhea, Enceladus, and Titan
		+ Striping on the surface of Saturn visible at highest power

**Session End**

* Time: 2157

I'm very happy with the quality of the notes. It's much, much better than my handwritten notes and much less work, so I'm likely to do this more consistently.

5 comments

  1. sparksbet
    Summaries are probably one of the areas where Gen AI is the most reliable (with the right prompt engineering, of course) and combining that with speech to text for this use case is a really clever solution to your problem! Kudos.

    1 vote
  2. carsonc
    Do you run this locally? What's your hardware setup?

    1. blitz
      Yeah! Everything runs locally. I've got an M3 Max MacBook Pro with 128 GB of RAM, but these models don't take very much RAM at all. I tried to download and run Llama 3.1 405B, and 128 GB of RAM wasn't enough!

      1 vote
      1. carsonc
        I have a lot of respect for what you are accomplishing here. Have you considered a different, astronomy focused LLM?

        There are several models listed here on GitHub; some of the astronomy-focused LLMs might be useful.

        I'm curious to know how these would compete against the much larger models you are using right now.

        1. blitz
          Thanks for this suggestion! I've only ever used models available from Ollama; I've never tried domain-specific ones. I'll look into it!