this post was submitted on 14 Jul 2023
4 points (75.0% liked)
Videos
Considering how many bad uses these images can be put to, what kind of safeguards can we put in place to stop them from being abused?
Preferably ones that aren't invasive to our privacy. After all, the better the images get, the less we will be able to believe in, unless we see it with our own eyes.
Can you give an example of what you mean by "bad uses"?
Of course. Let's say that before long you cannot trust a video of someone committing homicide, simply because it could be fake.
The opposite is also true: you can shatter the public's trust in an individual without him actually saying or doing any of the things he's accused of.
OK, thanks. So you are asking about protections from misinformation, deepfakes and such.
As technology improves, it may be downright impossible to tell real from fake with our own eyes- at which point what is "proof" becomes blurry. It will become "this is why we can't have nice things..." where innocents are at risk of harm (non-AI art getting rejected from competitions because it looks like AI art), and bad actors more often get away with shenanigans. Hopefully we're smart enough to figure out ways to avoid that kind of future.
However, I don't think restricting the technology itself, through legislation or otherwise, would be practical or very effective. Forgery and deception are age-old concepts, and people aren't going to stop trying to cheat/lie/steal. Some people (VFX artists?) can probably already make a believable fake homicide. And just look at all the fake UFO footage out there: we don't really need AI to deceive people, it's just that AI makes it more accessible, and perhaps now within reach for some lowlife who needs to cheat to be successful in life. Also, most countries already have laws in place against fraud, forgery, and libel, i.e. things that hurt others. It would be very difficult to regulate "misinformation" itself, though, because it can overlap with legitimate uses such as art and entertainment.
Of course, it would be nice to have only "ethical" AI, and this is what you are starting to see in the commercial space, but it is pretty easy to bypass these restrictions (not endorsing this, just an example of a quick search result). Also, not all AI systems will even bother trying to be ethical, and once the technology is more accessible, bad actors could just build their own AI systems from scratch. I also think any attempt at restriction through legal means would significantly hinder legitimate research in the field and slow progress on what may be our best chance at overcoming humanity's biggest challenges (climate change, etc.).
I like to think of AI as an extension of the human intellectual tool set, so let's not treat it like guns or drugs (physical things) but rather like libraries or the internet. Regulated to a practical extent, yes, but not really restricted with regard to what it can do. The fact that the internet was not highly regulated or highly controlled during its inception is a major part of why it is the amazing global network we have today.
OK, you make a great point. I just don't know that comparing it to a library would be wise; the gun metaphor is more apt. It's not like we need to make any real effort to use these AIs, we just need a couple of prompts. Even the slowest of us could use one, whereas a library involves research and at least some hard work to pull it all together.
Yes, I agree with you on the restrictions: we would just end up hiding information in order to enforce these laws, so it would be worse in the long term. But the dangers of this technology are endless. Should everyone have access to it, or just a few people who have undergone training and hold licenses?
Like I said, I feel like this is going to be a Wild West type of scenario, where those who are good with guns are free to do as they please.
Which would be better, America's approach to guns or the European one?
Realistically, zero safeguards. Stable Diffusion is open source, for better or worse, and that means people can tailor it to their needs.