ChatGPT Does Not Change My Thoughts on Low/No Code.
The world is headed for a SERIOUS "Citizen Developer" crisis, brought on in no small part by Low/No Code and tools like ChatGPT.
ChatGPT is AMAZING. But, it does NOT eliminate the need for developers to be the ones writing software.
Sure, with these sorts of tools a person with little to no prior coding experience can produce a functioning application or game. But, that comes with SEVERAL problems.
The biggest of these is unchanged from my prior assessment: these tools cannot maintain code, and "Citizen Developers" are even less equipped for the task. So the real question becomes: what happens when V1 of such a product turns out to be buggy or needs to evolve?
The other inherent problem is: how do you make money? Today, SOME people are using these tools to create apps and games, but it is still a fairly low number. As adoption grows, it will reach the point where your once-potential customers can use the same tools to the same ends.
Put another way: we are not "a few years" away from the golden age of AI-built applications. We are right in the middle of it. And if you aren't using these tools profitably today, then you're already running the risk that the bubble will have burst by the time you get something to market.
Small, bespoke solutions fall into this category. But, that was never a huge business area anyway.
I generally see a few areas where software is still "immune" to things like ChatGPT:
- Applications of sufficient size or complexity that an AI can neither produce the result in a single shot nor perform consistently at that scale. Note: while this limitation may be solvable over time, today we are not there for most sizable applications.
- Support. One of the things large corporations are willing to pay for is support. And to be willing to pay for it, they generally want evidence of competence. If your software is produced by AI but you have no developers capable of interpreting the code, then you cannot offer any reasonable guarantee of support.
- Innovation. Especially if you have a patent. AI is good at building what you ask of it when there is sufficient prior data to draw on. But, there will always be new ideas and concepts it cannot keep up with.
And I think the last SIGNIFICANT hurdle companies will face if they try to sell a ChatGPT-based solution is: lawsuits! I mentioned patents above. These models have already proven themselves, time and again, to be pretty bad about plagiarizing things. And some of those things will be covered by patents. As such, even if the AI is smart enough to detect the license a piece of software was written under, it may still not be able to determine whether it violates any patents.
I also look forward to the first infringement suit between two software companies that both had a generative AI model produce all or most of the code for their applications. It should be interesting. My money is on no one winning.
This is actually the one area I'm a little shocked isn't getting much attention, though. Especially given how much attention the art/photography scene is paying to their own issues in this regard. Lawsuits aren't exactly rare in the world of software development. And not knowing enough about software licenses and patents not only increases the odds that you'll accidentally run afoul of one; it also means you're less equipped to work your way out.
A lot of the time a company can escape being sued, or walk away more or less unscathed, if it can remedy the infringement in a timely and acceptable fashion. But, if you can't even understand what you're accused of in the first place, how do you proceed?
I foresee a lot of companies rapidly starting and shutting down in the near future due to unforeseen lawsuits.
Anyway, this isn't meant as a "doom and gloom" type of post. Mostly I'm looking at all of this as a very tempting thing; tempting enough that many will walk into it without understanding the risks.