From the natural-language-to-code capabilities of OpenAI Codex to the code-assistance features of GitHub’s Copilot, large language models (LLMs) and generative AI are rapidly changing how engineering teams work and transforming what productivity means.
Whether it’s navigating technical documentation or automating QA/QC workflows, engineering firms like Blankfactor are now unlocking the potential of generative AI for software development. That means both seamless solutions to help engineers thrive and significant efficiency gains for client partners.
Since the explosion of generative AI (GenAI) capabilities over one year ago, we’ve been busy developing our own LLM-powered prototypes for clients and internal teams. The result? BlankGPT, a custom engineering co-pilot and internal productivity assistant that creatively integrates both the OpenAI LLM and Blankfactor’s own private model. In this article, CTO Lucho Hristov shares more about how BlankGPT was developed, how it will bring value to Blankfactor engineers and client partners, and how it will speed up the AI-powered transformation of software development.
BlankGPT: a custom AI assistant and intelligent internal search
Internal LLMs and generative AI tools can help teams with a number of use cases. After several months of exploration and prototyping, we’ve established two core products for our staff and engineering teams: a custom engineering co-pilot and LLM-powered internal search.
The co-pilot will enable several core functions, including:
- Faster project onboarding with technical documentation analysis and discovery.
- Personalized coding assistance.
- Team data collection and knowledge sharing.
Developing a personalized AI sidekick for engineers
The BlankGPT development team’s core vision was to create an AI-enhanced sidekick that accumulates project knowledge over time, helping engineers streamline development tasks and work more efficiently.
“What we want to produce is to fine-tune a model per person,” Hristov said. “So we feed the model with every question that you’ve asked the model so far and the responses. And then this is your sidekick, essentially.”
As engineers interact with BlankGPT across the lifetime of a project and throughout the software development lifecycle, asking questions and accessing documentation, the AI co-pilot remembers these interactions.
This design distinguishes BlankGPT from off-the-shelf tools. ChatGPT responds to queries within a single, self-contained conversation; it does not draw on the broader context of a user’s previous sessions. With BlankGPT, engineers get increasingly refined, more valuable results and accelerate workflows as the model learns from their common searches and builds contextual knowledge over time.
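To make the idea concrete, here is a minimal Python sketch of per-engineer interaction memory. The class name, field names, and the chat-style fine-tuning format below are illustrative assumptions, not BlankGPT’s actual implementation.

```python
from dataclasses import dataclass, field


@dataclass
class SidekickMemory:
    """Accumulates one engineer's Q&A history for later fine-tuning.

    Illustrative sketch only; a production system would persist this
    store and handle access control.
    """
    engineer_id: str
    history: list = field(default_factory=list)

    def record(self, question: str, answer: str) -> None:
        # Every interaction is kept so the model can be fine-tuned
        # on this engineer's own questions and responses.
        self.history.append({"question": question, "answer": answer})

    def as_finetune_examples(self) -> list:
        # One common fine-tuning format: chat-style message pairs.
        return [
            {"messages": [
                {"role": "user", "content": turn["question"]},
                {"role": "assistant", "content": turn["answer"]},
            ]}
            for turn in self.history
        ]


memory = SidekickMemory(engineer_id="eng-042")
memory.record("Where is the payments API spec?",
              "See docs/payments/openapi.yaml.")
examples = memory.as_finetune_examples()
```

Accumulating history this way is what lets each engineer’s “sidekick” grow more useful over the lifetime of a project.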
Onboarding efficiencies — AI-powered technical documentation review
One of the biggest and most time-consuming challenges engineers face when onboarding with new projects is understanding the project context through technical documentation.
“With some clients, there’s a huge amount of documentation – it takes lifetimes to read,” said Hristov. “It’s very difficult for engineers to find the right documentation. If you navigate them to just catch the thread, they can figure it out, but it’s like searching for a needle in a haystack.”
With BlankGPT, developers can access technical documentation through AI-powered search, extracting summaries and retrieving data with near-instant results — tasks that would otherwise require hours of manual research. Additionally, as team members join long-term projects, they can quickly access contextual information and query project data.
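The retrieval step behind this kind of documentation search can be sketched very simply. The example below ranks documentation chunks by keyword overlap; a production system like the one described here would typically use embedding similarity instead, and the sample documents are invented for illustration.

```python
def score(query: str, chunk: str) -> int:
    """Count how many query terms appear in a chunk (case-insensitive)."""
    terms = set(query.lower().split())
    words = set(chunk.lower().split())
    return len(terms & words)


def search(query: str, chunks: list, top_k: int = 2) -> list:
    """Return the top_k chunks most relevant to the query."""
    ranked = sorted(chunks, key=lambda c: score(query, c), reverse=True)
    return ranked[:top_k]


docs = [
    "The billing service exposes a REST API for invoice creation.",
    "Deployment runs on Kubernetes with a blue-green strategy.",
    "Invoice creation requires an idempotency key in the API request.",
]
results = search("how do I create an invoice via the API", docs)
```

Even this toy version shows the shape of the workflow: a natural-language question goes in, and the most relevant passages come back in ranked order for the model to summarize.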
Faster onboarding frees up engineers to focus on more advanced aspects of problem-solving in client projects, delivering high-quality results faster.
Powering internal documentation and search with AI
The internal uses for BlankGPT go beyond software development; it will be used to deliver efficiencies and enhance digital experiences across teams. Other use cases include:
- Converting and standardizing documents. For instance, the tool utilizes LLMs to extract unstructured data from any CV, convert it into structured information, and build a new CV in a predefined format.
- HR database querying and summarization. Our global teams can ask BlankGPT to access company policies and HR documents, such as information on time off or reimbursement policies by region, for example.
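The CV-conversion use case above has two halves: the LLM extracts unstructured data as JSON, and a deterministic step maps that JSON onto the predefined format. A sketch of the second half is below; the field names are assumptions for illustration, not Blankfactor’s actual schema.

```python
import json

# Hypothetical target schema for the standardized CV.
TEMPLATE_FIELDS = ["name", "role", "skills", "experience_years"]


def normalize_cv(llm_json: str) -> dict:
    """Map LLM-extracted JSON onto a predefined CV structure."""
    extracted = json.loads(llm_json)
    cv = {}
    for field_name in TEMPLATE_FIELDS:
        # Fill a default so the output format is always complete,
        # even when the LLM could not find a field in the source CV.
        cv[field_name] = extracted.get(field_name, "N/A")
    return cv


# Simulated LLM output (in practice this comes from the extraction prompt).
raw = '{"name": "A. Engineer", "role": "Backend Developer", "skills": ["Java", "SQL"]}'
cv = normalize_cv(raw)
```

Keeping the formatting step outside the model makes the output predictable: the LLM handles the messy extraction, and plain code guarantees the final document always matches the template.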
Developing BlankGPT: LLM selection, hosting, and fine-tuning
The selection of the underlying LLM was a crucial decision for BlankGPT. With the proliferation of open-source, AI-powered developer tools, the market dominance of GPT-4, and the expansion of OpenAI’s developer APIs (TechCrunch), our team considered both the benefits and the risk potential of implementing public models in the context of private company data.
“There was one prominent AI evangelist who was saying that if you want to develop something internally because you have this great idea, your best option is to wait because OpenAI will just come out with it shortly,” said Hristov. But when OpenAI did develop a new implementation, it wasn’t the right solution — primarily due to data protection concerns.
BlankGPT offers built-in flexibility to interact with either OpenAI or a local model: sensitive data is identified and filtered, and each query is directed to the appropriate model to mitigate data exposure risk. Further, the BlankGPT development team is currently working to host our own OpenAI-powered model in Azure, a strategic shift that ensures greater control over data flow and usage.
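A hedged sketch of the routing idea follows: queries that match sensitive patterns stay on the local model, and everything else may go to OpenAI. The patterns and backend names below are illustrative assumptions, not the actual filters BlankGPT uses.

```python
import re

# Example sensitivity filters; a real deployment would use a far more
# thorough detection layer (PII classifiers, client-specific rules, etc.).
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # SSN-like numbers
    re.compile(r"\b\d{13,16}\b"),                # card-number-like digit runs
    re.compile(r"confidential", re.IGNORECASE),  # flagged keyword
]


def route_query(query: str) -> str:
    """Return which backend should handle this query."""
    if any(p.search(query) for p in SENSITIVE_PATTERNS):
        return "local-model"  # sensitive data never leaves the environment
    return "openai"


backend = route_query("Summarize the confidential merger memo")
```

The design choice here is defense in depth: even if a user forgets which model they are talking to, the router keeps sensitive content from reaching an external API.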
As technologies evolve, we’re constantly re-evaluating solutions that will ensure data privacy protections and deliver the tools necessary for our teams. We intend to continually explore new implementations that will optimize both of those priority areas.
RAG versus fine-tuning: costs and benefits
Blankfactor built BlankGPT using a combination of fine-tuning and retrieval-augmented generation (RAG). Fine-tuning produces more accurate, personalized models by training the AI on an engineer’s own questions and responses. RAG instead retrieves relevant passages, in variably sized chunks, from large bodies of documentation and uses them to ground the model’s answers.
Striking the right balance between accuracy and scalability was one key learning, Hristov explained. With RAG, very small chunks improve retrieval precision but strip away context, while longer excerpts preserve context and enable generalization at the cost of answer quality.
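The trade-off can be seen in even the simplest chunking routine. The sketch below splits a document into word-based chunks: a smaller `chunk_size` yields many precise but context-poor pieces, while a larger one yields fewer, context-rich pieces. The parameters are illustrative, not BlankGPT’s actual settings.

```python
def chunk_text(text: str, chunk_size: int, overlap: int = 0) -> list:
    """Split text into word-based chunks with optional overlap."""
    words = text.split()
    step = chunk_size - overlap  # overlap lets adjacent chunks share context
    chunks = []
    for start in range(0, len(words), step):
        piece = words[start:start + chunk_size]
        if piece:
            chunks.append(" ".join(piece))
    return chunks


# A toy 100-word document.
doc = " ".join(f"word{i}" for i in range(100))

small = chunk_text(doc, chunk_size=10)  # many precise, narrow chunks
large = chunk_text(doc, chunk_size=50)  # fewer, context-rich chunks
```

Tuning `chunk_size` (and overlap) per corpus is exactly the precision-versus-context balancing act described above.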
Ultimately, the team found that the cost efficiencies of RAG don’t outweigh the benefits of fine-tuning in an enterprise AI application.
Bringing BlankGPT to client partners
In financial services and other regulated industries, companies are naturally concerned about data protection while exploring LLM-powered solutions.
As we begin using BlankGPT on client projects, partners can choose to protect their company data by:
- Hosting all documentation in their own self-contained environment.
- Deploying Blankfactor’s infrastructure for the AI model in our environment and ingesting a client’s data securely.
Empowering teams to make the most of generative AI in software development
Along the way, Hristov and the BlankGPT development team discovered a crucial element of success in using developer tools like BlankGPT: the way users engage with the model.
“There are big differences between how people engage with the AI,” said Hristov. While some engineers have a natural aptitude for conversation with LLMs, others need resources to understand best practices and how to get the best results.
For instance, when users get unhelpful responses, they may abandon the conversation; but to train the model and see productivity gains, it’s crucial to see the conversation through. Users may also phrase queries with ambiguous wording, leading the model to misunderstand their intent.
To address these challenges and boost our teams’ confidence with BlankGPT, we’re producing documentation on how to effectively use LLMs. With effective use of generative AI for software development, we aim to help engineers focus on creative responses to higher-order problems, arming our teams to better meet complex development challenges.
From pilot to production: launching BlankGPT in 2024
The BlankGPT pilot launched this January across Blankfactor’s own engineering teams. So far, results have been promising both for productivity and employee perception.
“Some people might feel that the AI replaces their brain, and they’re afraid of it,” said Hristov. “For some people (myself included), I feel like I have superpowers with it. I feel I’m becoming an expert in areas where it’s really helpful to become faster and smarter.”
With internal testing ramping up over 2024, the next phase is client partnerships. Blankfactor hopes the tool can demonstrate value by both accelerating delivery timelines and enhancing engineer satisfaction.
“We’ve already started demonstrating tools to prospective clients,” noted Hristov. “We’re hopeful that we can engage in a production-grade initiative with clients relatively shortly.”
The team’s vision is to embrace the future of AI-augmented software, where teams can focus their creativity on higher-order problems and better meet the challenges of software development’s rising complexity.
Work with GenAI experts to unlock your teams’ productivity
At Blankfactor, we help industry giants transform with the latest technologies. From the latest in GenAI solutions to the most ambitious legacy modernization challenges, our experts in data engineering, AI, and financial technology know how to drive impact in your business. We bring an adaptive approach and deep technical expertise to every project, and with BlankGPT, we’re taking that delivery excellence one step further.
Let’s discuss how your organization can improve operational resilience and efficiency through next-generation AI and automation. Whether you’re looking to optimize development efficiency, enhance customer experience in call center operations, or automate QA/QC, we can help.