lily33

joined 1 year ago
[–] lily33@lemm.ee 5 points 2 days ago

Technically, "enforced pay it forward" is called credit. Your debt would then be "the amount you still have to pay forward".

Of course, this defeats both the spirit and the purpose of a pay it forward scheme.

[–] lily33@lemm.ee 12 points 5 days ago* (last edited 5 days ago) (1 children)

I don't know - but I'm willing to bet those instances where people were saved weren't calls from anonymous VoIP numbers.

[–] lily33@lemm.ee 2 points 1 week ago
[–] lily33@lemm.ee 17 points 1 week ago* (last edited 1 week ago) (6 children)

Indeed. Linux ~~audio~~ also allows control characters like backspace to be part of a file name (though it is harder to make such a file, as you can't just type the name). Which is just horrible.
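A minimal sketch of this, assuming a Linux/POSIX filesystem where only `/` and NUL are forbidden in a file name, so a control character like backspace (`\x08`) can be embedded programmatically even though you can't type it:

```python
import os
import tempfile

# On POSIX filesystems, any byte except '/' and NUL is a legal
# filename character - including control characters like backspace.
d = tempfile.mkdtemp()
weird = os.path.join(d, "file\x08name")  # embedded backspace

with open(weird, "w") as f:
    f.write("hello")

# The raw name round-trips, backspace included.
print(os.listdir(d))
```

Listing such a directory in a terminal is where the horror shows: the backspace is interpreted by the terminal, so the name displays as something it isn't.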

[–] lily33@lemm.ee 4 points 1 week ago (1 children)

"Just works" is not a mentality imposed by Microsoft, and has nothing to do with loss of control. It's simply (a consequence of) the idea that things which can be automated, should be. It is about good defaults, not lack of options.

[–] lily33@lemm.ee 9 points 2 weeks ago

Boeing's next big solution could be a strike by 32,000 workers

Fixed.

[–] lily33@lemm.ee 53 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

It's not an article about LLMs not using dialects. In fact, they have learned said dialects and will use them if asked.

What they did was ask the LLM to suggest adjectives associated with sentences - and it would associate more aggressive or negative adjectives with African dialect.

Seems like not a bias by AI models themselves, rather a reflection of the source material.

All (racial) bias in AI models is actually a reflection of the training data, not of the modelling.

[–] lily33@lemm.ee 6 points 3 weeks ago (1 children)

It's certainly good, I'm not arguing that. My point is, if the wine team is interested, they can fork the unmaintained project, and work on that. Eventually, people will switch over to the active fork. What Microsoft is doing, is helping the process along, and making it easier. So it's good, and helpful - but not really a "donation" to winehq.

[–] lily33@lemm.ee 122 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

I guess it's simply the framing: it was a not-very-actively-maintained open source project, so they've decided to turn it over to a new maintainer. Calling that a 'donation' is pushing it a bit.

[–] lily33@lemm.ee 82 points 4 weeks ago* (last edited 4 weeks ago) (6 children)

I'm confused - why is Microsoft trying to - or expected to, by the article authors - patch a vulnerability in GRUB?

[–] lily33@lemm.ee 6 points 1 month ago (1 children)

It's good, but it is corporate.

[–] lily33@lemm.ee 29 points 1 month ago
 

This is a meta-question about the community - but seeing how many posts here are made by L4sBot, I think it's important to know how it chooses the articles to post.

I've tried to find information about it, but I couldn't find much.

 

I'm not a lawyer, but my understanding of a license is that it gives me permission to use/distribute something that's otherwise legally protected. For instance, software code is protected by copyright, and FOSS licenses give me the right to distribute it under some conditions.

However, LLMs are produced by a computer, and aren't covered by copyright. So I was hoping someone with a better understanding of the law could answer some questions for me:

  1. Is there some legal framework that protects AI models, so that I'd need a license to distribute them? How about using them, since many licenses restrict use as well?

  2. If the answer to the above is no: By mentioning, following and normalizing LLM licenses, are we essentially helping establish the principle that we do need permission from companies to use their models, and that they have the right to restrict us?
