this post was submitted on 29 Aug 2023
155 points (100.0% liked)

Technology


There is a machine learning bubble, but the technology is here to stay. Once the bubble pops, the world will be changed by machine learning. But it will probably be crappier, not better.

What will happen to AI is boring old capitalism. Its staying power will come in the form of replacing competent, expensive humans with crappy, cheap robots.

AI is defined by aggressive capitalism. The hype bubble has been engineered by investors and capitalists dumping money into it, and the returns they expect on that investment are going to come out of your pocket. The singularity is not coming, but the most realistic promises of AI are going to make the world worse. The AI revolution is here, and I don’t really like it.

[–] FaceDeer@kbin.social 8 points 1 year ago (1 children)

You just said the same thing as the comment you're responding to, though. He pointed out that AI can replace the lower 80%, and you said the AI can write some code but might have trouble doing the expert work of proving the code meets the safety criteria. That's where the 20% comes in.

Also, it becomes easier to see where AI can contribute when you widen your view to all the work required for critical application development, beyond just the particular task of writing code. The company surrounding that task does a lot of non-coding work that is also amenable to AI replacement.

[–] PenguinTD@lemmy.ca 4 points 1 year ago (1 children)

That split won't work because the top 20% would not want their day job to be cleaning up AI code. Time-wise, it's a much better investment for them to write their own template-generation tool so the 80% can focus on the key part of their task, rather than taking AI-generated templates that may or may not be wrong and then hunting all over the place for bugs.

[–] jarfil@beehaw.org 6 points 1 year ago* (last edited 1 year ago) (1 children)

Use the AI to fix the bugs.

A couple of months ago, I tried it on ChatGPT. I had never written or even seen a single line of COBOL, so I asked ChatGPT to write me a program that prints the first 10 elements of the Fibonacci series. I copy-pasted it into a COBOL web emulator... and it failed with some errors. I copy-pasted the errors back to ChatGPT, asked it to fix them, and by the second or third iteration the program was working as intended.
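For reference, the task itself is tiny. The working program computed the same thing as this minimal Python sketch (the actual COBOL isn't reproduced here; this is just an illustration of what was asked, assuming the series starts from 0 and 1, as conventions vary):

```python
def first_fibonacci(n: int) -> list[int]:
    """Return the first n elements of the Fibonacci series, starting 0, 1."""
    series = []
    a, b = 0, 1
    for _ in range(n):
        series.append(a)
        a, b = b, a + b  # each element is the sum of the previous two
    return series

print(first_fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```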

If an AI were run with enough context to hold all the requirements for a module, then iterated with input from a test suite, all anyone would need to write would be the requirements. Use the AI to also write the tests for each requirement, maybe build a library of them, and the core development loop could be reduced to ticking boxes for the requirements you want in each module... but maybe an AI could do that too?
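The loop described here (generate code from the requirements, run the test suite, feed failures back in) can be sketched as follows. `ask_model` and `run_tests` are hypothetical stand-ins for the model call and the test runner, not real APIs:

```python
def develop_module(requirements, ask_model, run_tests, max_iterations=5):
    """Iteratively ask a model for code until the test suite passes.

    ask_model(prompt) returns a source string; run_tests(source) returns a
    list of failure messages (empty when everything passes). Both are
    hypothetical stand-ins for real integrations.
    """
    source = ask_model(f"Write a module meeting these requirements:\n{requirements}")
    for _ in range(max_iterations):
        failures = run_tests(source)
        if not failures:
            return source  # all requirements satisfied
        # Feed the failures back, exactly as in the COBOL experiment above
        source = ask_model(
            "Fix this code so the failing tests pass.\n"
            f"Code:\n{source}\nFailures:\n" + "\n".join(failures)
        )
    raise RuntimeError("requirements not met within the iteration budget")
```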

Weird times are coming. 😐

[–] FaceDeer@kbin.social 6 points 1 year ago (1 children)

I'm a professional programmer and this is how I use ChatGPT. Instead of asking it "give me a script to do [big complicated task]" and then laughing at it when it fails, I tell it "give me a script to do [one small step]." Then, once I've confirmed that works, I say "okay, now add a function that takes the output of the first function and does [the next step]." Repeat until done, correcting it when it makes mistakes. You still need to know how to spot problems, but it's way faster than writing it myself, especially since I don't have to go rummaging through API documentation and whatnot.
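Built that way, the finished script is just a chain of small functions, each confirmed before the next was requested. A hypothetical two-round result might look like this (both function names and the task are invented for illustration):

```python
import re

def extract_numbers(text: str) -> list[int]:
    """Round 1: pull integers out of free-form text."""
    return [int(tok) for tok in re.findall(r"-?\d+", text)]

def summarize(numbers: list[int]) -> dict:
    """Round 2: take the first function's output and report basic stats."""
    return {"count": len(numbers), "total": sum(numbers)}

print(summarize(extract_numbers("3 errors, 12 warnings, 40 files")))
# {'count': 3, 'total': 55}
```

Each round is small enough to verify at a glance, which is what makes the "spot problems as you go" approach workable.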

[–] amki@feddit.de 1 points 1 year ago (2 children)

I mean, that is exactly what programming is, except you type to an AI and have it type the script. What is that good for?

You could have just typed the script in the first place.

If ChatGPT can use the API, the task can't be too complex; otherwise you're in for a surprise once you find out what ChatGPT didn't care about (caching, usage limits, pricing, usage contracts).
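To illustrate the point: handling usage limits, for instance, usually means wrapping every call in retry logic with backoff, which a naively generated script tends to omit. A minimal sketch, where `request` and `is_rate_limited` are hypothetical stand-ins for the real API call and error check:

```python
import time

def call_with_backoff(request, is_rate_limited, max_attempts=4, base_delay=1.0):
    """Retry a call with exponential backoff when the API reports a usage limit.

    request() performs one API call; is_rate_limited(exc) reports whether a
    failure was a rate-limit response. Both are hypothetical stand-ins.
    """
    delay = base_delay
    for attempt in range(max_attempts):
        try:
            return request()
        except Exception as exc:
            if not is_rate_limited(exc) or attempt == max_attempts - 1:
                raise  # not a usage limit, or out of retries: give up
            time.sleep(delay)
            delay *= 2  # back off harder each time
```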

[–] abhibeckert@beehaw.org 7 points 1 year ago* (last edited 1 year ago) (1 children)

Could have just typed the script in the first place.

Sure - but ChatGPT can type faster than me. And for simple tasks, CoPilot is even faster.

Also, it doesn't just speed up typing; it also speeds up basics like "what did Bob name that function?"

[–] FaceDeer@kbin.social 3 points 1 year ago

And stuff like "I know there's a library out there that does the thing I'm trying to do, what's it named and how do I call it?"

I haven't been using ChatGPT for the "meat" of my programming, but there are so many things in my line of work that little one-off scrappy Python scripts make so much easier.
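As an illustration of the kind of one-off scrappy script meant here, a hypothetical example: tally the files under a directory by extension.

```python
# Hypothetical throwaway script: count files by extension under a directory.
from collections import Counter
from pathlib import Path

def count_extensions(root: str) -> Counter:
    """Tally file extensions under root, recursively."""
    return Counter(
        p.suffix or "(none)"  # files with no extension get their own bucket
        for p in Path(root).rglob("*")
        if p.is_file()
    )

if __name__ == "__main__":
    for ext, n in count_extensions(".").most_common():
        print(f"{n:6d}  {ext}")
```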

[–] FaceDeer@kbin.social 3 points 1 year ago

it's way faster than writing it myself

As I already explained:

I could write the scripts myself, sure. But can I write the scripts in a matter of minutes? Even with a bit of debugging time thrown in, and the time it takes to describe the problem to ChatGPT, it's not even close. And those descriptions of the problem make for good documentation to boot.