I've been using Neeva for a couple of months now, since Kagi's recent pricing update and metered tiers. This announcement came out of the blue, but I guess the writing was on the wall when they removed existing features (Spaces, personalized searches, etc.) earlier this year as part of a pivot to LLMs to make the business sustainable.
I’m still using Kagi since I’m relatively happy with the early adopter plan I was grandfathered into, and I’m extremely happy with the search quality. However, I’ve started using bangs a lot more so I don’t exceed my search quota. I’d love to find a (paid) alternative if there is one, but I haven’t found any.
Search is expensive to run, I guess. Makes you wonder how much money Google makes from ads on search, or if it’s just a loss leader.
Google makes almost 60% of their revenue from search ads; it's easily one of the largest single revenue sources anywhere.
I'm curious if you've noticed any improvements to Kagi's search after their recent updates addressing that. I just might return to Kagi!
I haven't noticed any significant improvements to search recently - it's been good enough for me for as long as I've been using it. I have used their new AI summarizing tools a couple of times, but I always forget they're there. Depending on how heavy a search user you are and how comfortable you are with bangs, the newer plans might still not be worth it, to be completely honest.
For me, my biggest gripe with other search engines is how easily SEO-spam listicle sites clog up results. Kagi has been the only search engine I've found so far that does a really good job of filtering those out.
Same here - I'm still on Legacy Professional (bought a year to stave off the quotas for now) but the search quality has far surpassed Google's.
This is actually the first time ever I'm hearing about them.
No wonder it didn't bode well for them :)
Exactly lol, spend some monies on marketing too.
I've been tempted by Kagi lately. I'm currently using a self-hosted SearXNG instance, essentially a 'search proxy' that can pull from multiple engines. That's nice for putting a layer between me and search providers, but it's only 'okay' at results. I think I'll at least try the Kagi trial.
I don't love the idea of yet another subscription nickel-and-diming me, but search is important, and it seems Kagi has done a lot to provide quality results, be pretty transparent about privacy and how the search operates, and in general help 'find the signal in the noise' - or at least that's what I've heard from others.
Curious how you have your self-hosted SearXNG set up. Is it on a VPS? And can it also pull from Kagi? I'm wondering if there's a way to refine results further if you can pull from multiple search engines...
I'm not the original commenter but I also self-host a SearXNG instance. Mine's running on a VPS, and I put it behind a webauthn proxy so it's not publicly accessible. Kagi isn't currently one of the supported engines but there's an open issue to add it so it might be supported at some point in the future, assuming Kagi has an API that you can use with an account there.
The main customization point in SearXNG is that you can pick which combination of supported engines to source results from. However, since a lot of it relies on scraping, some engines don't work consistently (DuckDuckGo in particular seems to actively work against scrapers and is constantly breaking). It does have something roughly like Kagi's "lenses," but it's a pre-baked list (web, images, news, videos, and so on), and each one has a specific list of supported engines you can enable or disable. There's nothing similar to Kagi's ability to block or boost specific domains in results; you just get what your selected engines return, with no real way to refine beyond that.
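For anyone curious what that engine selection looks like in practice, here's a minimal sketch of a SearXNG `settings.yml` override - the engine names and keys are from SearXNG's defaults as I remember them, so verify against the current docs before copying:

```yaml
# Minimal SearXNG settings.yml sketch (verify keys against current docs).
use_default_settings: true
engines:
  - name: duckduckgo
    disabled: true    # scraping DDG breaks often, per the comment above
  - name: google
    disabled: false
  - name: bing
    disabled: false
```

With `use_default_settings: true`, you only list the engines whose defaults you want to flip, which keeps the override short.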
Thanks for elaborating. Wouldn't having to scrape from multiple search engines + having a vps setup in a specific region add to the latency? If it's not too bad, I'd be interested in exploring that option!
Latency hasn't really been a problem for me. I haven't actually timed it, but I don't perceive searches as taking any longer than a second or two at most to show results, pulling data from a couple of main engines (Google and Bing) plus a few instant-answer sources (Wikipedia, Wikidata, DictZone).
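That matches how a metasearch layer behaves if it fans out to its engines in parallel: total latency tracks the slowest upstream engine, not the sum of all of them. A toy sketch of that fan-out, with made-up delays standing in for real HTTP round trips:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-engine response times in seconds - illustrative
# numbers only, not measurements of any real engine.
ENGINE_DELAYS = {"google": 0.4, "bing": 0.3, "wikipedia": 0.2}

def query_engine(name, delay):
    time.sleep(delay)  # stand-in for the real HTTP round trip
    return name

def fan_out(delays):
    """Query all engines concurrently; return results and wall time."""
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=len(delays)) as pool:
        results = list(pool.map(query_engine, delays, delays.values()))
    return results, time.monotonic() - start

results, elapsed = fan_out(ENGINE_DELAYS)
# Wall time tracks the slowest engine (~0.4s here), not the sum (~0.9s).
print(f"{len(results)} engines answered in {elapsed:.2f}s")
```

So a "second or two" end-to-end is plausible even with several engines enabled, as long as none of them is individually slow.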
That is really interesting. I'm also running a personal SearXNG instance and have pretty bad latency, somewhere around ~5 seconds on average. It was bad enough that I had to increase the timeout cap. I guess that's just my poor upload speed and several layers of indirection then?
I wouldn't guess that upload speed would have that big an effect on latency (I don't imagine much data is sent to the servers per query), but I suppose it's possible. Depending on what you mean by indirection, that sounds like the more likely culprit to me. Also, it's a Python app, so it's not the most efficient; running on a slower machine like a Raspberry Pi could also be a bottleneck.
I don't know if it can pull from Kagi - I don't think it's an option by default, but you can probably add custom engines. I haven't tinkered too heavily with it.
I have a tiny Unraid server running a few self-hosted things, and Unraid makes Docker containers super easy to install and use, so I installed SearXNG through that. Then I set my browser to use it as both my search engine and homepage, so I hit the local instance and search there.
It's pretty fast, though it can sometimes run afoul of certain search engine APIs (too many requests, etc.) - but not too often. It supports quite a few engines out of the box - Startpage, Brave, Google, Bing, Qwant, etc. - and you can also specify different engines for image search and customize a number of other things. It's pretty neat, but I've only scratched the surface.
The only issue I've found is that, unlike Google, Bing, Kagi, etc., you miss out on instant-answer features like tracking-number recognition.
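For a sense of what that missing feature amounts to, an "instant answer" is often just pattern recognition on the query before any engine is hit. A toy example for UPS-style tracking numbers ("1Z" plus 16 alphanumerics) - real engines obviously handle many more carriers and formats, and this is not how any particular engine implements it:

```python
import re

# UPS-style tracking numbers: "1Z" followed by 16 alphanumeric
# characters. Toy matcher for illustration only.
UPS_PATTERN = re.compile(r"\b1Z[0-9A-Z]{16}\b")

def instant_answer(query: str):
    """Return a canned answer if the query looks like a tracking number."""
    match = UPS_PATTERN.search(query.upper())
    if match:
        return f"UPS tracking number detected: {match.group()}"
    return None

print(instant_answer("where is 1Z999AA10123456784"))  # matches
print(instant_answer("best pizza near me"))           # None
```

A matched query gets the canned answer rendered above the regular results; everything else falls through to the normal engine fan-out.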
I also don't truly know how private it is. Presumably, since it's querying Google rather than me being logged into an account in a browser, it's less associated with me... but because I host it inside my house, all of the queries to those search engines still come from my IP. I didn't expect super-anonymous search or anything; I just liked having a layer between me and Google, and having multiple engines underneath. If I wanted additional privacy, running the whole thing over a VPN would probably be best (or VPS + VPN, maybe), or even just using something like Mullvad's search engine that's available when using their VPN.
I just liked the idea of hosting my own search proxy/engine and having that level of control vs. using existing engines directly.