Tools like Claude Code are making it much easier to build applications quickly. While the implications are numerous and hard to predict, there will clearly be an increase in the number of applications built. Cloning or copying someone else's approach is going to be a lot faster, meaning that the technical advantage of having an application is probably much more short-lived than it was in the past. Clearly solving a problem and understanding the customer become paramount.
A focus on customer understanding would tend to push companies away from acquiring general-purpose software products and back towards custom applications built for their needs. One might think that the Palantir approach, a product accompanied by forward deployed engineers who customize a base level of software to a customer's specific needs, would be effective here. Indeed, Palantir itself is considering the possibility of forward deployed AI to serve this purpose.
On the lighter end, for those who don't need to construct their own ontologies, there will be an increased need to make all of these customized applications easy to deploy. Players like Salesforce's Heroku perhaps made an error in restricting their free tier a few years too soon. Ruby on Rails has the added bonus of a well-known and easy-to-understand application structure, making it a good target framework for code generation. There is a lot of opportunity in providing standardized hosting for the products of novice AI-assisted programmers, with the added bonus that their inefficient code will generate plenty of CPU-cycle billing. Perhaps Fly.io or Vercel will step into this space, but many large orgs are aiming in this direction, whether Figma from the design side or one of the AI model vendors from the generation side. Vercel seems to have adopted this as its strategy, but it is locked into a purist single-page-application ecosystem, a product of technology fashion chasing that will ultimately constrain it.
The other factor is the increasing capability of large general-purpose models not just to build applications, but to serve critical functions in their domains: reasoning models that can handle program-flow questions, and tool usage that lets them reach out and do the deterministic things (like counting) that computers used to be good at. As these capabilities grow, the question of "do we need a product for that, or can we just send it to ChatGPT/Claude/Gemini/Llama/Mistral/Deepseek?" will continue to put pressure on companies that don't have a compelling workflow or an approach that isn't easily duplicated.
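The tool-usage pattern mentioned above is simple at its core: the model emits a structured request, and ordinary deterministic code fulfills it. A minimal sketch, assuming a hypothetical request format and tool name (this is not any vendor's actual API):

```python
# Sketch of model tool use: the model emits a structured tool request,
# and deterministic code handles the part models historically got wrong.
# The request shape and tool names here are hypothetical illustrations.

def count_letters(text: str, letter: str) -> int:
    # A deterministic task (counting) delegated away from the model.
    return text.count(letter)

TOOLS = {"count_letters": count_letters}

def handle_tool_call(request: dict):
    """Dispatch a model-issued tool request to the matching function."""
    fn = TOOLS[request["name"]]
    return fn(**request["arguments"])

# A model asked "how many r's are in strawberry?" might emit:
request = {"name": "count_letters",
           "arguments": {"text": "strawberry", "letter": "r"}}
print(handle_tool_call(request))  # 3
```

The point of the indirection is that the model only has to produce the request; the counting itself is done by code that cannot miscount.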
The broader economic point, though, is that there is going to be a lot more software, that software will be built more quickly, and it will be able to rely on large models for novel capabilities. The baseline economic response to an increase in supply without a concomitant increase in demand is for the price to fall. Absent other factors, software will become cheaper. From an investment standpoint, the software itself is now less interesting than it used to be. We now want to see what relationship the company has with its customers, how well it solves the problem, and what unique dataset or insight it has that will allow it to maintain a competitive moat. It can't just be the code.