To be honest, I’ve been in a bit of a panic recently, watching so many people who can’t code at all start making complete games with AI…
Meanwhile, ironically, we (and Unity) are still improving the editor, fixing bugs with Vulkan, Android, HDRP, URP, and UIToolkit, waiting however many seconds for code to compile, and sitting through domain reloads…
I mean, what’s the point of digging through all of this while AI, and the new wave of rookies coding with it, replaces us and our workflow? I really cannot see the future.
This ends with someone saying to their device, “I want an RPG with Wolverine and (insert name of actor).” It thinks for a minute, and then they are off playing, watching, or whatever. It’ll be a lot less polished, but it offers immediate user customization.
It won’t end with just replacing some coding. It also isn’t about what exists today; it’s about what the next generation growing up on AI tools will build over the following decades.
Show me a single commercially viable game coded through AI prompting by non-coders. It doesn’t exist. A cheap Snake clone is not replacing us. And o1 is hardly better than GPT-4. Current AI applications have likely already peaked, given they want to ask $2k/month for access to o1 proper. AGI is not possible under the current LLM paradigm; at best it’s a single component of an unknown whole.
I’m not personally worried about it. Mainly for the following reasons:
It’s not currently financially viable. I think ChatGPT loses around $650,000 per day. A lot of the AI tools are cool, but not many people are willing to pay for them in their current state. It would be a different story if it rapidly improved, but progress seems to have slowed down a lot. I guess most AI projects are just running on investor money.
It has a bad public opinion. I feel like this might decrease over time, I’m guessing no one will really care if AI is used in about 10 years.
It’s unpredictable and can cause legal issues. I think Assassin’s Creed Shadows was accused of plagiarizing Zoro’s sword. I feel like someone on the art team probably used an AI art tool for concepting and didn’t realize it generated Zoro’s sword.
It’s incohesive. I find AI art and code just feels random and doesn’t work well when combined into a larger project.
Also, other technologies haven’t replaced professionals: motion capture hasn’t replaced animators, photogrammetry hasn’t replaced 3D artists, and 3D art looks more realistic but we still have 2D artists.
I can’t see AI replacing experienced professionals. I am a bit worried about future juniors though. Even if the AI was only as good as a junior level then companies would start using AI to replace junior jobs. There wouldn’t really be an incentive to study if you can’t get your foot in the door. Then over time the seniors would retire and there would be no one to replace them.
Not really sure what the end goal is with AI. Let’s say that far in the future we have the perfect AI, personally I think having an AI that’s better than everyone at everything sounds like a nightmare.
One aspect of the AI situation is that search is getting worse, and AI is replacing it. Beyond that, a lot of help is found as much on Discord as in places like here, maybe more so. So finding that information is getting harder.
A lot of people consider coding to be mostly googling a lot of stuff, and AI makes that faster and easier. To me that’s the main impact.
Even if it can code a whole game, what’s the risk of that compared with buying a template on the Asset Store? You can use it as-is and produce slop, or customize it for what you actually need.
At the end of the day actual effort will be more rewarded and be higher quality.
AI / generative tools won’t replace technical or programming professions anytime soon, if ever.
Not for anything complex.
At least not until private data becomes public, which I don’t ever see happening.
There simply aren’t enough full-quality applications that are publicly open source. What is available is just a handful of data to even consider training on, for anything serious.
Most engineering knowledge is kept behind closed doors, so there is little to train on.
Most quality games are closed source, so there aren’t enough samples to train on.
And even if there were, that’s still far fewer samples than the image and text sources models are already trained on.
Adding in the variations and languages of different software makes training even harder.
So beyond snippets of code and simple games, we won’t see anything serious anytime soon in the engineering or programming fields.
I agree, but not with everything.
The market is already getting flooded with AI stuff, and 98% of it is a case for the trash can.
The real issue I personally see is that the broad masses seem to be OK with that.
AI can’t compete with something made by humans, not anytime soon. But it doesn’t have to.
Quality doesn’t seem to mean as much as it used to, by far. This isn’t a problem that started with the “AI era”; AI just makes it a good amount worse.
Reality has proven, and not just since yesterday, that you can throw sh** on the market and people will still gladly pay for it.
This isn’t only about coding and games; it applies to the entire entertainment sector: games, movies, series, and more.
I see totally busted, low-quality stuff ranking at the top. I see porn films with actors and actresses who look like drug victims fresh out of jail. I see games suffering from game-breaking bugs, flaws, and performance issues, including many triple-A titles. Dumb, antisocial no-brains are popular broadcasters.
And this isn’t the exception anymore, it’s the rule.
AI isn’t the cause of this degenerating quality standard, but it certainly pours a lot more oil on the fire.
Like I said, people still buy it. Obviously, they also buy crappy AI stuff. This, in turn, pushes even more professional creators to stop, or to simply use AI themselves.
That is why I believe the “threat” coming from AI is a reasonable one. I think that sooner or later, content creators and coders in the entertainment sector will become a minority, mostly busy fixing the flaws of AI content up to a certain degree.
The most use tools like Copilot have been to me is helping convert a mathematical principle into C# code appropriate for Unity, mainly because I suck at maths.
For example, “write a method that can be used to determine if a point falls inside a triangle”, and it gives me a basic MonoBehaviour script. I can take that script, test it in isolation, and then adapt the code into a form that fits my needs.
Needless to say, the script, in isolation, provided by Copilot would be of no use to someone who doesn’t know how to code, and likely of little use to a complete beginner.
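For illustration, here is a minimal sketch of the kind of point-in-triangle test such a prompt tends to produce: the classic same-side check using edge cross-product signs. This is my own reconstruction in plain Python rather than a Unity MonoBehaviour, not Copilot’s actual output.

```python
def point_in_triangle(p, a, b, c):
    """Return True if 2D point p lies inside (or on the edge of) triangle abc."""

    def sign(p1, p2, p3):
        # Z component of the cross product (p1 - p3) x (p2 - p3);
        # its sign tells which side of edge p2->p3 the point p1 is on.
        return (p1[0] - p3[0]) * (p2[1] - p3[1]) - (p2[0] - p3[0]) * (p1[1] - p3[1])

    d1 = sign(p, a, b)
    d2 = sign(p, b, c)
    d3 = sign(p, c, a)

    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0

    # Inside iff p is on the same side of all three edges
    # (i.e., the signs never disagree).
    return not (has_neg and has_pos)
```

The same logic translates directly to a Unity C# method taking `Vector2` arguments; the winding order of the triangle doesn’t matter because the check only looks for disagreeing signs.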
And of course, if you ask it something on which there is next to no written literature, it proves useless. I was trying to find out specifically how Terraria does its tile-based lighting. There doesn’t seem to be anything concrete written about it, so Copilot just gave me some very vague answers that I knew weren’t going to work. In the end I needed to work it out myself (long story short: a flood-fill algorithm).
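For what it’s worth, a flood-fill lighting pass of the sort described can be sketched like this: a BFS that spreads each light source outward, dimming by a fixed falloff per tile and keeping the brightest value reaching each cell. This is a minimal Python sketch under my own assumptions (uniform falloff, 4-way spread), not Terraria’s actual implementation.

```python
from collections import deque

def flood_light(width, height, sources, falloff=1, solid=None):
    """BFS flood fill: `sources` maps (x, y) -> intensity; returns a dict of
    tile -> brightest light level reaching it, decaying by `falloff` per step."""
    solid = solid or set()          # tiles that block light entirely
    light = dict(sources)
    queue = deque(sources.items())
    while queue:
        (x, y), level = queue.popleft()
        if level <= falloff:
            continue                # too dim to propagate any further
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height and (nx, ny) not in solid:
                nl = level - falloff
                if nl > light.get((nx, ny), 0):
                    light[(nx, ny)] = nl   # found a brighter path to this tile
                    queue.append(((nx, ny), nl))
    return light
```

A real engine would likely run this over light *channels* per color and use a larger falloff for solid tiles rather than blocking them outright, but the spreading logic stays the same.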
Moral of the story: there’s a lot of intricate problem solving in game dev that AI can’t replace, only supplement, and only in some cases.
A learned man came to me once.
He said, “I know the way, – come.”
And I was overjoyed at this.
Together we hastened.
Soon, too soon, were we
Where my eyes were useless,
And I knew not the ways of my feet.
I clung to the hand of my friend;
But at last he cried, “I am lost.”
-Stephen Crane
AI like ChatGPT doesn’t have any actual understanding of what it’s talking about.
If you never learn to code and just rely on AI for everything, you risk being led down a dead-end path. And if you don’t know what you’re doing, you won’t be able to fix it either.
All cutting-edge technology is safe, because models are trained on pre-existing code and documentation. The more code is out there, the more precise the results an LLM can generate. LLMs are also always out of date, because training one is currently very expensive.
So if something new was released in the past few months, the LLM isn’t trained on it. And even if it is trained on something new, the information available is so minuscule that it likely won’t be able to generate results with any accuracy. LLMs are only good for standardized solutions with decades-long histories and plenty of available data.
I have actually had decent help from ChatGPT with DOTS. I just feed it a very low-level example of how things work (systems, jobs, lookups, etc.) and then tell it what I want it to make. Sometimes it makes mistakes, but I just use it for structure and never expect finalized code. It’s also pretty good at upgrading an existing ISystem to use Jobs once you show it how.
AI is only as good as the person using it. You have to prime it for your use case, which requires knowledge and understanding, or it just spits out garbage.
In fact, many studios have long since begun using AI in game design to optimize their work processes, and apparently this is becoming the new norm in the industry (for example, Game Art Outsourcing Studio https://ilogos.biz/game-art-production/). I understand your fears, especially when AI lets beginners create entire games without knowing how to code. It’s a sharp contrast with the fact that we still have to solve routine problems, debug, wait for compilation, and wrestle with the editor.
But on the other hand, AI is just another tool. It doesn’t remove all the difficulties of development, and it doesn’t replace deep knowledge and skills. Perhaps the future is precisely symbiosis: using AI to accelerate processes, freeing up time for creative tasks rather than chasing small bugs.
In my opinion, Copilot and similar services are most useful for students who don’t have a great tutor, and for established programmers.
The people who just want to quickly make a game to boast or “get rich” will get frustrated just as quickly as before.
That does not mean that there isn’t an overall shift in efficiency since with good base knowledge it does indeed speed up your work.
Maybe. But I saw a new version of ChatGPT that managed to create code for a very simple game. It’s a very intriguing breakthrough and will have a significant impact on the future of game development. At the very least, it will make development easier and slightly lower the entry barrier into the field.
I wanted to post a bit about MS Copilot in particular and picked this thread to resurrect because it is more or less on topic. My earlier messages were pretty negative about AI assistance with coding but I’ve been using Copilot more often and I have to say the experience has been generally positive.
Just so we’re clear, I see no future where AI suddenly codes everything, and if it did, why stop at coding? Why wouldn’t it just diagnose medical problems, do your bookkeeping, choose compatible mates, etc.? There are clearly limits imposed by choices… not everything is black and white, and how best to code something depends on situations that an AI system cannot reasonably know about.
Still, four or five times now I have looked for answers to questions, found generally poor and incomplete ones, and asked Copilot instead. In the hands of a coder it is a very nice assistant: I can get a simple working example, then refine it, and refine it again. In the end I can use that code as the basis for my solution, with what I’ve learned as a guide.
“What I’ve learned” is an important part of the process, as the iterative mechanism provides what humans could but rarely do in these support groups. It patiently showed me the steps from “it works” to “it works like I want”. Again, humans could do this, but you can see how many questions never get answered, get answered incorrectly, or get answered rudely.
Copilot wasn’t rude and in the end I got a better understanding of the problem and solution. I can see this tech being quite useful for lots of things, not “everything” but some things. It is up to the individual to determine where and when to use the tool and when to accept the solution provided.
You aren’t required to implement every VS IntelliSense suggestion and you don’t have to use what an AI code generator produces.