I'm Paying for a Search Engine. Here's Why.

I'm now paying for Kagi Search ... and ditching Perplexity, at least for now.

[Image: a screenshot of the Kagi Search homepage]

This is something I thought I would never do. Something that, before I did it, would've seemed ridiculous to me. I started paying for a search engine. Recently, I've begun using (or trying out) a search engine named Kagi, which costs $10/month. I know it seems a bit crazy. I mean, paying for a search engine? Why would anyone do that when you can just use Google or DuckDuckGo or Bing or even Yahoo for completely free? Free as in zero, nothing, no money spent (and not even necessarily paying with your data, either). Well, I decided to pay for Kagi anyway. Let me explain...

Before this, I had heard of Kagi from a couple of places, specifically through the Orion web browser for Mac. Orion is a cool little web browser for macOS that works a lot like Safari but with more quality-of-life features. Since Orion was developed by Kagi, I knew of them as a niche paid search engine. I think I may have tried a couple of searches with them, but nothing really caught my attention, and certainly not enough to make me pay for a search engine. So I went on with my life as normal until another product caught my eye.

It was called Perplexity, and it was an AI-powered search/answer engine. (The idea behind an "answer engine" is that, unlike a standard search engine, it answers your question right away instead of just providing search results.) I found this idea very interesting, especially since I tried a couple of searches and it seemed to work quite well. I started by using Perplexity for the occasional question, then began testing out the "Pro Search" mode for research, then slowly used Perplexity more and more alongside my search engine, until finally I thought: why not just make it my search engine? After all, it also gave you a list of "sources" that worked similarly enough to a traditional search engine for when you needed to do that sort of search.

As I used Perplexity more, I started to think more about its Pro plan, a $20/month subscription. $20 a month was certainly a lot, but this tool was providing me with a lot of value, and near-unlimited "Perplexity Pro" queries seemed enticing. I eventually convinced myself to get it by comparing Perplexity Pro to other AI services. I told myself that Perplexity offered multiple Large Language Models (which you can think of as different versions of ChatGPT from different companies), while other AI products, like OpenAI's ChatGPT Plus, charge the same amount for a single LLM in a wrapper that definitely wasn't as good as Perplexity's. Now, all of a sudden, I was paying for a search engine, albeit an untraditional one. I told myself I was paying for an AI search engine, with the emphasis on the AI being what I was paying for, but over time it made paying for a search engine start to feel not so crazy after all.

All was good for a while, and Perplexity and I were going steady, with most of my web queries going through it. I liked just having the information there, plain and simple, while still being able to dig deeper if I wanted to. It felt like the best of both worlds (as Hannah Montana would say). But over time, I found myself using Perplexity's "sources" section more for traditional web results, not being satisfied with the links it provided, and ending up at a traditional search engine anyway. For lots of things, I found I just wanted normal, clean, traditional search results, especially when it came to finding information and news on recent events, or tracking down a very specific website or webpage (which happened more often than you'd expect). I started exploring how to use search engines alongside Perplexity, like figuring out whether Perplexity or my traditional search engine should be the default when typing in my browser's toolbar, and which sorts of queries were best for Perplexity versus traditional search. But even still, I just couldn't give up Perplexity. Nothing could beat those AI overviews when I just needed a quick answer or a pointer toward where to look.

So, thank you 404 Media, because a 404 Media article about Kagi Search is what got me to look at it again, this time more willing to pay for a search engine (since I had already been paying for a version of one for a while). I started with a free trial account of just 100 searches, and what I found was a really nice search experience. The searches were quick, the results were good (with very few SEO-spam articles among them), and all the listicles were bundled together. (It's actually so great; why don't more search engines do that?) I could also have Kagi rank certain websites higher or lower in search results based on my preferences. But the real killer feature, the one that had me rethinking where my search money should go each month, was Kagi's "quick answers". Quick answers are Perplexity-like summaries of your results that only show up when you tell them to. You can trigger one in three ways: end your search with a question mark, press the "quick answer" button after you search, or press the "q" key on your keyboard after searching.

This was genius! While all these companies working on search and AI have been trying to figure out how to detect the intent behind a search and decide whether to show AI summaries or just plain web results, this tiny niche web company, Kagi, cracked the code: just let the user, or more accurately the person, tell you what they want. If I want AI results, I either type a question mark or press "q". If I don't, I don't.
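If it helps to see the idea spelled out, here's a minimal sketch of that explicit-intent check in Python. To be clear, this is my own illustration, not Kagi's actual code; the function name and the button/keypress flag are invented for the example.

```python
def wants_quick_answer(query: str, quick_answer_requested: bool = False) -> bool:
    """Show an AI summary only when the person explicitly asks for one."""
    # Trigger 1: the search ends with a question mark.
    # Trigger 2: the person pressed the "quick answer" button or the "q" key
    #            after searching (modeled here as a simple boolean flag).
    return query.strip().endswith("?") or quick_answer_requested


# A plain navigational search: no summary, just regular results.
print(wants_quick_answer("kagi orion browser download"))                # False
# A question: results plus a quick answer.
print(wants_quick_answer("how do kagi quick answers work?"))            # True
# Pressing "q" (or the button) after any search also asks for a summary.
print(wants_quick_answer("kagi pricing", quick_answer_requested=True))  # True
```

The whole point is that there's no guessing step at all: the summary is opt-in per search, which is exactly why it never gets in the way.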

Kagi is also half the price of Perplexity, and I don't really need all of Perplexity's Large Language Models, so I've decided to give Kagi a shot longer term. I've canceled my Perplexity subscription this month and signed up for Kagi. So far, I'm feeling pretty good about it. But we'll see how it holds up...