Using NLP to Create Developer Tools: Q&A with Michael Turck from Forefront.ai

David Banys

Michael is the Founder of Forefront.ai, a company that builds natural language processing (NLP) tools for developers.

We were excited to talk with Michael about some of the work he’s done with large language models (LLMs). In particular, we wanted to know how the lessons his team learned from building a hosted inference engine for NLP led to a new product for developers who write tests and documentation.

Let’s jump in!

Forefront.ai, which previously built a hosted NLP inference API, is now building tools to help developers write tests and documentation


Railway: Can you give us a little introduction into what Forefront is doing and what kind of tools you’re making?

Turck: Forefront traditionally was a platform for deploying and fine-tuning large language models. All these models like GPT-3 that are really good at language-based tasks aren’t the easiest to use, so we made a 1-click deploy to make using them easier.

Railway: So if we wanted to do inference on GPT-3, we could use your service to do that and add data and parameters?

Turck: Yeah, pretty much. GPT-3 isn’t open source though, so you’d have to use a similar open-source model like GPT-J or NeoX. They’re all different and excel at different things. But yeah, you can do a 1-click deploy to spin up one of these models, then run inference on it, feed it whatever input you want, and own your model.
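
To make that concrete, here’s a minimal sketch of self-hosted inference on an open-source model like GPT-J using the Hugging Face transformers library. This is not Forefront’s API, just an illustration of the kind of workload a 1-click deploy wraps up for you:

```python
# Minimal sketch: run inference on an open-source LLM (GPT-J) yourself
# with Hugging Face transformers. Note: GPT-J-6B is large and needs a
# big GPU (or roughly 24 GB of RAM on CPU) just to load.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-j-6B")

result = generator(
    "Write a one-sentence summary of what a unit test is:",
    max_new_tokens=40,
    do_sample=True,
    temperature=0.7,
)
print(result[0]["generated_text"])
```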

Railway: So we’ve heard that you’ve just launched an early release of a new product, one that is more of an end-user product for developers rather than ML engineers. Can you tell us about the new product?

Turck: Basically we wanted to move into a more verticalized solution around automated testing.

It’s a VSCode extension: if you have functions or classes in your IDE, then as you’re writing code a little action button pops up that can generate unit tests and documentation.

When you click the button, documentation appears above the function, or tests appear in a new test file in your IDE. If you like it you can do nothing and keep it, or else you can modify it. And then you just commit it along with your code.

It kinda handles the context shift of going from writing code to thinking about what test cases the code should have – and then it just does all the work for you. Test writing is a bit tedious and most people aren’t super interested in it so we’re building something that handles it for you and does it well.
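
To give a feel for the workflow, here’s a hypothetical illustration (not actual Forefront output) of what generated documentation and a generated pytest file could look like for a small Python function:

```python
import re

# The function you wrote, with a docstring of the kind such a tool generates:
def slugify(title: str) -> str:
    """Convert a title to a URL-friendly slug.

    Lowercases the input, replaces runs of non-alphanumeric characters
    with single hyphens, and strips leading/trailing hyphens.
    """
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")


# And the kind of tests that might land in a new test file (pytest style):
def test_slugify_basic():
    assert slugify("Hello, World!") == "hello-world"

def test_slugify_collapses_separators():
    assert slugify("  Multiple --- separators  ") == "multiple-separators"
```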

Railway: Since your team had been so focused on making NLP models accessible, it sounds like this is a tool that developed naturally from your subject matter expertise. Is that fair?

Turck: Yeah exactly. I think a lot of tools we’ve built have stemmed from things we wanted ourselves.

One of the lessons we learned from running a hosting platform is that these language models are pretty new and people don’t really know how to use them well. Even people who are good at them are always discovering new ways to use them. And people who don’t know how to use them aren’t getting effective outcomes with these models.

From building Forefront, the hosted NLP service, we got really good at knowing how to get a good outcome. This new product is for coding, which is a useful and nontrivial test case, and now we can target not just people interested in using LLMs as an API, but literally any developer who writes tests.

Forefront is available as an open beta for developers who want to make writing tests and documentation easier with the help of NLP


Railway: What’s on the public roadmap for the next few months? Are you working toward a release?

Turck: Right now it’s a lot of customer discovery and transitioning to early beta testers. Anyone who wants to use it can make an account and try it out. We’re trying to figure out what’s valuable, what can be improved, what to support next, stuff like that.

Railway: What are the early returns so far? Does it save a ton of time writing tests and documentation?

Turck: Yeah definitely. It’s a pretty magical experience, it just saves a lot of time.

But it’s kind of like when the language models first came out – it’s like, here’s this magical thing and if it does what it says it does then it’s super valuable.

And so what we have now works pretty well for contained use cases, like functions that don’t have a lot of dependencies, and we’re working with early users to expand use cases and applications.

Railway: Where does Railway fit into the application stack? Can you describe a little bit how Railway comes into play?

Turck: We’ve been using Railway since we started the company. We used Railway to spin up Redis for caching. We also used Railway to host the database for Temporal, which we’ve used to run billing, templates, and a number of critical things that get a lot of use in production.

Railway was just the easiest way to spin things up. Databases are kind of annoying to set up anywhere else, and Railway is just literally one click, copy the credentials, good to go. So the convenience of Railway is great.
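
As a rough illustration of that “copy the credentials, good to go” flow, here’s a minimal caching sketch that assumes a Railway-provisioned Redis instance whose connection string is available to the app as a REDIS_URL environment variable (the variable name is an assumption) and uses the redis-py client:

```python
import os
import redis

# Connect using the credentials Railway provides; REDIS_URL is assumed here.
cache = redis.from_url(os.environ["REDIS_URL"], decode_responses=True)

def cached_generation(prompt: str, generate) -> str:
    """Return a cached model output for `prompt`, generating and caching on a miss."""
    cached = cache.get(prompt)
    if cached is not None:
        return cached
    output = generate(prompt)          # e.g. a call to the inference service
    cache.setex(prompt, 3600, output)  # keep for an hour
    return output
```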

With this new product, we have a few applications that are similar. We just wanted somewhere easy to go and deploy. With Railway you connect the GitHub repo, set up the branch, get a domain, done. Good to go.

Railway: If people want to find out more about Forefront, where should they go?

Turck: [new website coming soon] beta.forefront.ai