On the heels of releasing its new generative AI models, Google updated its Code Assist tools to work with Gemini 2.0 and expanded the external data sources it connects to.
Code Assist will now run on the recently released Gemini 2.0, whose larger context window lets it take in larger enterprise codebases.
Google will also launch Gemini Code Assist tools in private preview, connecting the platform to data sources such as GitLab, GitHub, Google Docs, Sentry.io, Atlassian and Snyk. That will let developers and other coders pull information from those sources by asking Code Assist directly in their IDEs; previously, Code Assist connected only to VS Code and JetBrains.
Google Cloud senior director for product management Ryan J. Salva told VentureBeat in an interview that the idea is to allow coders to add more context to their work without interrupting their flow. Salva said Google will add more partners in the future.
Formerly Duet AI, Code Assist was launched for enterprises in October. As organizations sought ways to streamline coding projects, demand for AI coding platforms like GitHub Copilot grew. Code Assist added enterprise-grade security and legal indemnification when the enterprise option was released.
AI where developers work
Salva said connecting Code Assist to other tools developers use provides more context for their work without them having to simultaneously open multiple windows.
“There’s so many other tools that a developer uses in the course of a day,” Salva said. “They might use GitHub or Atlassian Jira or DataDog or Snyk or all these other tools. What we wanted to do is to enable developers to bring in that additional context to their IDE.”
Salva said developers just need to open the Code Assist chat window and ask it to summarize the most recent comments for particular issues or the most recent pull requests on repositories, “so that it queries the data source and brings the context back to the IDE and [the] large language model can synthesize it.”
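To make that workflow concrete, here is a minimal, hypothetical sketch of the pattern Salva describes: a connector queries an external data source (GitHub's public REST API in this example), flattens the results into context, and hands that context to a language model to summarize. The function names and the placeholder model call are illustrative assumptions, not Google's implementation of Code Assist.

```python
# Illustrative sketch only: fetch recent pull requests from a repository and
# prepare them as context for an LLM summary, mirroring the "query the data
# source and bring the context back to the IDE" flow described above.
import requests

GITHUB_API = "https://api.github.com"


def fetch_recent_pull_requests(owner: str, repo: str, token: str, limit: int = 5) -> list[dict]:
    """Query the GitHub REST API for the most recently updated pull requests."""
    resp = requests.get(
        f"{GITHUB_API}/repos/{owner}/{repo}/pulls",
        headers={"Authorization": f"Bearer {token}", "Accept": "application/vnd.github+json"},
        params={"state": "all", "sort": "updated", "direction": "desc", "per_page": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


def build_context(pull_requests: list[dict]) -> str:
    """Flatten PR numbers, titles and authors into a text block the model can read."""
    return "\n".join(
        f"#{pr['number']} {pr['title']} (by {pr['user']['login']}, state: {pr['state']})"
        for pr in pull_requests
    )


def summarize_with_llm(context: str) -> str:
    """Placeholder for whatever model call your stack uses (e.g. a Gemini API client)."""
    prompt = f"Summarize the most recent pull requests:\n{context}"
    raise NotImplementedError("Send `prompt` to your model of choice here.")
```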
AI code assistants were some of the first significant use cases for generative AI, especially after software developers began using ChatGPT to help with coding. Since then, a slew of enterprise-focused coding assistants have been released. GitHub released Copilot Enterprise in February, and Oracle launched its Java and SQL coding assistant. Harness came out with a coding assistant built with Gemini that gives real-time suggestions.
Meanwhile, OpenAI and Anthropic began offering interface features that let coders work directly on their chat platforms. ChatGPT’s Canvas lets users generate and edit code without copying and pasting it elsewhere. OpenAI also added integrations with tools like VS Code, Xcode, Terminal and iTerm2 from the ChatGPT macOS desktop app. Meanwhile, Anthropic launched Artifacts for Claude so Claude users can generate, edit and run code.
Not Jules
Salva pointed out that while Code Assist now supports Gemini 2.0, it remains wholly separate from Jules, the coding tool Google announced during the launch of the new Gemini models.
“Jules is really one of the many experiments to emerge out of the Google Labs team to show how we can use autonomous or semiautonomous agents to automate the process of coding,” Salva said. “You can expect that over time, the experiments that graduate from Google Labs, those same capabilities, might become a part of products like Gemini Code Assist.”
He added that his team works closely with the Jules team and is excited to see Jules progress, but Code Assist remains the only generally available enterprise-grade coding tool powered by Gemini.
Salva said early feedback from Code Assist and Jules users shows great interest in Gemini 2.0’s latency improvements.
“When you’re sitting there trying to code and trying to stay in the flow state, you want those kinds of responses to come up in milliseconds. Any moment the developer feels like they’re waiting for the tool is a bad thing, and so we’re getting faster and faster responses out of it,” he said.
Coding assistants will still be crucial to the growth of the generative AI space, but Salva said the next few years may see a change in how companies develop code generation models and applications.
Salva pointed to the 2024 Accelerate State of DevOps Report from Google’s DevOps Research and Assessment team, which found that 39% of respondents distrusted AI-generated code and pointed to a decline in documentation and delivery quality.
“We have as an industry with AI assistive tools focused largely on throughput productivity improvements and velocity improvements over the course of the last four years,” Salva said. “And as we’re starting to see that that be associated with a drop in overall stability, I suspect here that the conversation in the next year is really going to shift to how are we using AI to improve quality across multiple dimensions.”