AI Chat Search with Mendable
Mendable.ai is a SaaS product for developers to integrate AI chat search into their applications. The most common use case is bringing chat to docs, but Mendable is also a good fit for customer support, internal training, and product copilots.
We sat down with Mendable.ai Co-founder and CTO Nicolas Camara to learn more about how his team is building this exciting product.
Mendable.ai homepage
Railway: What can you tell us about Mendable?
Nicolas: Mendable started as an AI chat search for developer-focused companies. We’ve now evolved into a ‘chat with your data’ ecosystem where we provide a complete suite of tools such as API, React components, analytics, and enterprise-tailored solutions around smart search.
Right now we’re getting some serious workloads. Last month we had over 400K messages queried on our platform and it’s growing pretty fast. And we don’t have to worry too much about scaling services, Railway handles most of the work.
Railway: That’s a lot of queries! Where are you finding all these users?
Nicolas: We have a pretty good top of funnel. We’re integrated with a lot of companies and open source projects. We have a little “Powered by Mendable” button in our component, so when people use our product around the web they can see where it comes from. We also get some attention on Twitter.
Railway: What’s the next stage for growth?
Nicolas: Right now we’re trying to focus on the developer side of things. We’re building out the API. We’re also focused on getting the enterprise offering together. The key for enterprise is patience as well as addressing privacy concerns — so that’s top of mind for us.
Mendable makes it easy to bring chat search into documentation
Railway: What can you tell us about your tech stack?
Nicolas: We run most of our stack on Railway. The first time I tried it, it was super easy to get started and get my server running, so I was like, “Ok, cool, this is awesome.” I set up a staging environment, started testing with a bit more traffic, and really liked it because I didn’t have to worry too much about stuff. You all handle the scale, and the pricing looked pretty good for us, so I was like, you know what, let’s just move our stuff to Railway for now and see how it goes. That’s what we did, and honestly it’s going great. We have a couple of services in there and we really like that we don’t have to worry too much about them.
We have JavaScript servers that handle most of the workload for completions, embeddings, etc. We also have a Python service that is solely focused on data ingestion, and some staging pipelines we use to make sure things are working fine. We also use Redis on Railway for background tasks, so we can keep track of the data workload and where everything is going.
It’s possible to define which LLM to use, create a custom prompt, and define the level of creativity in responses within the Mendable console
Railway: What kind of challenges come with growing so fast? What’s next to tackle?
Nicolas: The biggest challenge right now is addressing rate limiting and overloaded models. We’re constantly hitting 429s with the OpenAI API and addressing those has been quite a challenge as it doesn’t depend just on us. We’re also exploring some open source LLMs to improve the service without depending too much on a single vendor. We’ll see how it goes.
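A common client-side mitigation for 429s is retrying with exponential backoff plus jitter. The sketch below is illustrative only, using a toy `RateLimitError` and a fake API call rather than OpenAI’s SDK:

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for the 429 error an LLM API client raises."""

def call_with_backoff(fn, max_retries=5, base_delay=1.0):
    """Retry fn on rate-limit errors, doubling the delay each attempt
    and adding a little jitter so retries don't arrive in lockstep."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Toy API call that returns a 429 twice before succeeding.
calls = {"n": 0}
def flaky_completion():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("429 Too Many Requests")
    return "completion text"

result = call_with_backoff(flaky_completion, base_delay=0.01)
```

Backoff only smooths over transient limits, which is one reason teams also explore spreading load across multiple model providers.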
Railway: Where can readers learn more about Mendable and check out the product?
Nicolas: Check out our site mendable.ai, and also look for us on Twitter @mendableai.