We Built an AI-Powered Search App Over the Weekend. What’s Next?

Jim Benedetto

Oct 31, 2024

We decided to spend the weekend seeing how quickly we could build a clone of Perplexity, the AI search engine. Then we open-sourced it. Here's how.

Last week we released the public beta of Salt AI. The driving idea behind Salt, which lets users build, test, and deploy powerful workflows on an elastic GPU cloud, has always been to accelerate AI development. By that, we mean providing a development environment and open ecosystem that makes it easier and faster to go from idea to proof of concept to a scalable, production-ready app.

Our most active users, of course, are the members of our own team, who are always exploring the next exciting use case for the Salt platform. In that pursuit, we decided to spend the weekend seeing how quickly we could build a clone of Perplexity, the AI search engine. 

Like many others, we’re excited about Perplexity challenging Google in the search space by delivering a fantastic LLM-powered search experience. Perplexity recently raised $250M at an impressive $3B valuation. We’re big fans of their product and agree that the best way to search in 2024 is through a conversational interface.

But it’s not just Perplexity: a revolution is underway in the rapid development of AI-powered applications that can fetch enormous valuations and deliver huge value to their customers. The faster innovators can ideate, prototype, iterate, and collaborate, the more we will see highly specialized, targeted AI solutions come from anywhere, with applications in healthcare, finance, logistics, commerce, and beyond.

We believe these solutions will be built on Salt.

So we accepted our own challenge and set out to build our Perplexity clone over the weekend (spoiler alert: we did). Then we open-sourced it to see what the community does next. Check out the video below to see how we did it.

Pretty neat, right? The best part is that you can run and iterate on this workflow here.

Curious to learn more about our process and how to work with LLM nodes on Salt? Here are some of the details:

  • We added nodes to call Groq’s API and several of LlamaIndex’s powerful LLM interactions (see the first sketch after this list)

  • We then wired them together and experimented quickly in the Salt Workflow Editor, which let us observe the intermediate responses we generated and share them with the team to make tweaks

  • When we got something that worked, we deployed it to an API with one click (see the second sketch below)
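To give a concrete feel for that first step, here’s a rough sketch of the search-and-synthesize loop in plain Python, using the llama-index-llms-groq integration. The model name, the `fetch_search_results` helper, and the prompt are simplified placeholders rather than the exact nodes in the workflow:

```python
# Rough sketch of the search-and-synthesize loop; the model name, the
# fetch_search_results helper, and the prompt are illustrative placeholders.
from llama_index.core.llms import ChatMessage
from llama_index.llms.groq import Groq

llm = Groq(model="llama3-70b-8192", api_key="YOUR_GROQ_API_KEY")

def fetch_search_results(query: str) -> list[dict]:
    """Placeholder for the web-search step; return dicts with
    "title", "url", and "snippet" keys from whatever search API you use."""
    raise NotImplementedError("plug in your search API of choice")

def answer_with_citations(query: str) -> str:
    # Number each source so the model can cite it inline, e.g. [1], [2].
    sources = fetch_search_results(query)
    context = "\n".join(
        f"[{i}] {s['title']} ({s['url']}): {s['snippet']}"
        for i, s in enumerate(sources, start=1)
    )
    messages = [
        ChatMessage(
            role="system",
            content="Answer using only the numbered sources below and cite "
                    "them inline like [1]. Say so if the sources don't cover "
                    "the question.\n\n" + context,
        ),
        ChatMessage(role="user", content=query),
    ]
    return llm.chat(messages).message.content
```

The numbered-context pattern is what makes the answers feel Perplexity-like: the model is constrained to the retrieved sources and cites them inline.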
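And once a workflow is deployed, it’s just an HTTP endpoint you can call from anywhere. The URL and payload shape below are illustrative only; your deployment page shows the real values:

```python
import requests

# Illustrative only: substitute the endpoint URL and payload shape shown
# on your own deployment page.
resp = requests.post(
    "https://<your-deployment-url>/run",  # hypothetical endpoint
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={"query": "Who founded Perplexity?"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```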


Let us know what you think of this workflow, and what you plan to create next, in the Salt AI Discord.