AI this, AI that: you can't go anywhere without something trying to force AI on you, usually a company trying to get you to buy into whatever it has sunk billions into. So indie devs have begun fighting back with their No Gen AI Seal.
As an artist who is learning to code, it's different.
Sure, having access to undo and HSV adjustment is night and day, but you still must nail color, composition, values, proportion, perspective, and so on. That's especially true when plenty of shortcuts are available to traditional artists too, who can simply paint over a projection. Besides saving a ton of money and making daily practice easier, the only extra things digital art gives you are more noob traps, like brush packs, and the lack of confidence that comes from relying on undo and similar tools. I transferred to traditional oil paints just fine, because the fundamentals are what separate the trash from the okay and above.
It is night and day when you ask AI how to make a multiplication table versus applying what you have learned to work out the logic yourself. Using AI the wrong way in programming means you don't learn the fundamentals, which is to say you don't learn to program.
Comparing using AI to learn to program with learning to paint on an iPad is wrong.
Comparing using AI to learn to program with using AI to make art for you is more apt.
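To make the multiplication-table example concrete, here is the logic a learner is supposed to work out for themselves rather than ask for; a minimal sketch in Python (the language choice and function names are mine, not the commenter's):

```python
# Build an n-by-n multiplication table by hand: the "logic" is just
# two nested loops, where cell (row, col) holds row * col.
def multiplication_table(n):
    return [[row * col for col in range(1, n + 1)] for row in range(1, n + 1)]

def format_table(table):
    # Right-align each number to the width of the largest entry
    # so the columns line up when printed.
    width = len(str(table[-1][-1]))
    return "\n".join(" ".join(f"{cell:>{width}}" for cell in row) for row in table)

print(format_table(multiplication_table(5)))
```

Working out the nested loop and the alignment yourself is precisely the kind of fundamentals practice the commenter is talking about.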
None of your examples come close to a comparison with AI, which steals from people's work to generate approximate nonsense while burning massive amounts of electricity.
The rapid growth of AI and the investments into the underlying AI infrastructure have significantly intensified the power demands of data centers. Globally, data centers consumed an estimated 240–340 TWh of electricity in 2022—approximately 1% to 1.3% of global electricity use, according to the International Energy Agency (IEA). In the early 2010s, data center energy footprints grew at a relatively moderate pace, thanks to efficiency gains and the shift toward hyperscale facilities, which are more efficient than smaller server rooms.
That stable growth pattern has given way to explosive demand. The IEA projects that global data center electricity consumption could double between 2022 and 2026. Similarly, IDC forecasts that surging AI workloads will drive a massive increase in data center capacity and power usage, with global electricity consumption from data centers projected to double to 857 TWh between 2023 and 2028. Purpose-built AI infrastructure is at the core of this growth, with IDC estimating that AI data center capacity will expand at a 40.5% CAGR through 2027.
Let's just say we're at the halfway point: 600 TWh per annum, compared to 285 for gamers.
So more than fucking double, yeah.
And to reiterate, people generate thousands of frames in a session of gaming, vs a handful of images or maybe some emails in a session of AI.
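As a sanity check on the "more than double" claim, the arithmetic works out (using the 600 TWh midpoint assumed above and the thread's 285 TWh gaming figure):

```python
# Rough ratio of assumed AI data-center draw to global gaming draw (TWh/year).
ai_twh = 600      # midpoint guess between the 2022 estimate and the doubled projection
gaming_twh = 285  # the thread's figure for global gaming
ratio = ai_twh / gaming_twh
print(f"AI draws about {ratio:.2f}x what gaming does")  # ≈ 2.11x
```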
If you learned to code with AI then you didn't learn to code.
If you learned math with a calculator you didn’t learn math.
Same vibes as “if you learned to draw with an iPad then you didn’t actually learn to draw”.
Or in my case, I'm old enough to remember "computer art isn't real animation/art" and also the criticism aimed at Photoshop.
And plenty of people criticized Andy Warhol before then, too.
Go back in history and you can read about criticisms of using typewriters over handwriting as well.
Have you ever looked at the file size of something like Stable Diffusion?
Considering the data it's trained on, do you think it's:
A) 3 Petabytes B) 500 Terabytes C) 900 Gigabytes D) 100 Gigabytes
Second, what's the electrical cost of generating a single image using Flux versus 3 minutes of Baldur's Gate, or something similar, on max settings?
Surely you must have some idea of these numbers and aren't just parroting things you don't understand.
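For what it's worth, the per-image comparison can be sketched with back-of-the-envelope arithmetic; every wattage and runtime figure below is an illustrative assumption of mine, not a measurement:

```python
def energy_wh(watts, seconds):
    # Energy (Wh) = power (W) * time (s) / 3600 (seconds per hour)
    return watts * seconds / 3600

# Assumed figures, purely illustrative: a single Flux image taking
# ~20 s on a ~350 W consumer GPU, versus 3 minutes of a demanding
# game pinning that same GPU at full draw.
image_wh = energy_wh(350, 20)
gaming_wh = energy_wh(350, 180)
print(f"one image: {image_wh:.2f} Wh, 3 min of gaming: {gaming_wh:.2f} Wh")
```

Under those assumptions a few minutes of gaming draws more at the wall than one local image generation, though neither figure accounts for the data-center cost of training the model in the first place.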
What a fucking curveball joke of a question: you take a nearly impossible-to-quantify comparison and ask if it's equivalent?