this post was submitted on 23 Feb 2024
126 points (92.6% liked)

Technology

Tyler Perry Puts $800M Studio Expansion On Hold After Seeing OpenAI’s Sora: “Jobs Are Going to Be Lost”

Tyler Perry is raising the alarm about the impact of OpenAI's Sora on Hollywood.

[–] TheOneCurly@lemm.ee 35 points 9 months ago (9 children)

Sora can sometimes do one-minute clips that mostly look OK as long as you don't pay too close attention. We are incredibly far away from coherent, feature-length narratives, and even those aren't likely to be thematically interesting or engaging.

[–] kescusay@lemmy.world 32 points 9 months ago (5 children)

Yep. I watched their demo clips, and the "good" ones are full of errors, have lots of thematically incoherent content, and - this is the biggie - can't be fixed.

Say you're a 3D animator and build an animation with thousands of different assets and individual, alterable elements. Your editor comes to you and says, "This furry guy over here is looking in the wrong direction, he should be looking at the kangaroo king over there, but it looks like he's just glaring at his own hand."

So you just fix it. You go in, tweak the furry guy's animation, and now he's looking in the right direction.

Now say you made that animation with Sora. You have no manipulatable assets, just a set of generated frames that made the furry guy look in the wrong direction.

So you fire up Sora and try to fine-tune its instructions, and it generates a completely new animation that shares none of the elements of the previous one, and has all sorts of new, similarly unfixable errors.

If I use an AI assistant while coding, I can correct its coding errors. But you can't just "correct" frames of video it has created. If you try, you're looking at painstakingly hand-painting every frame where there's an error. You'll spend more time trying to fix an AI-generated animation that's 90% good and 10% wrong than you will just doing the animation with 3D assets from scratch.

[–] Buelldozer@lemmy.today 9 points 9 months ago* (last edited 9 months ago) (3 children)

Now say you made that animation with Sora. You have no manipulatable assets, just a set of generated frames that made the furry guy look in the wrong direction.

"Sora, regenerate $Scene153 with $Character looking at $OtherCharacter. Same Style."

Or "Sora, regenerate $Scene153 from time mark X to time mark Y with $Character looking at $OtherCharacter. Same Style."

It's a new model: you won't work with frames anymore, you'll work with scenes, and when the tools get a bit smarter you'll be working with scene layers.

"Sora, regenerate $Scene153 with $Character in Layer1 looking at $OtherCharacter in Layer2. Same Style, both layers."

I give it 36 months or less before that's the norm.

[–] Sprucie 6 points 9 months ago

I agree. I don't think people realise how early into this tech we are at the moment. There are going to be huge leaps over the next few years.
