General_Effort

@General_Effort@lemmy.world


General_Effort ,

It's just as easy. I was surprised to learn that they hired a voice actress. I guess hiring voice actors is cheaper than risking having to explain technology to a jury.

General_Effort ,

Oh yes, let's make a private company adjudicate the law. That'll teach 'em.

General_Effort , (edited )

Just because a claim doesn't stand up in court doesn't make it fraudulent. Actual fraudulent claims have landed people in prison.

ETA: Once again, I have no idea why I am being down-voted. The copyright fanatics here are really something else.

General_Effort ,

Well, what is the point?

General_Effort ,

I hereby grant approval for anybody to change, alter, and or use my comment for AI and commercial means.

I'm guessing this is what gets you down-voted. The "information wants to be owned" brigades are out in full force today.

General_Effort ,

Not really. They have to do something, or they become liable. If YouTube decides that something is fair use and a court disagrees, then they are on the hook for damages. They'd have to pay a lot of money to copyright lawyers, only to risk having to pay damages anyway.

And, you know, the same libertarians who are now attacking YouTube for not going full feudal would be absolutely outraged if it did fight for fair use. It's stealing property, as far as they are concerned.

General_Effort ,

Most false claims are not fraudulent. The burden of proof is where US law puts it.

Thanks for explaining how people see this.

General_Effort ,

I’m not even sure what you’re arguing for since you seemed to have done a complete 180 on your stance. You earlier said you don’t want YouTube adjudicating the law (by choosing sides in a copyright claim), but now you’re arguing that they have to do this in order to avoid liability.

I see the problem.

EG Young people may not buy alcohol. When a cashier asks for ID, they are not adjudicating the law but following it. Right?

When you personally copy something, you must follow the law. EG When you re-upload some image for use on Lemmy, you must "judge" if you can legally do so. Maybe it's fair use, but that's not as straightforward as checking an age. When you make the call, that does not mean that you adjudicate the law.

Under US law, someone can send a DMCA notice to the server. If the server owner ignores the take-down request, then they become liable to pay damages for the copyright infringement. Maybe the owner decided that it was a case of fair use, but that does not mean they adjudicate the law.

I hope that helped.


The issue here is copyright trolls claiming copyright over things that don’t belong to them.

That is criminal fraud. A copyright troll usually means someone who stays on the legal side of the line.

Currently, these cases are reviewed by bots,

That is wrong. But thank you for helping me understand the problems of the people here.

General_Effort ,

I understand the insanity. They want a private company to prosecute "fraud". Yikes. Less Ayn Rand and more civics lessons, please.

General_Effort ,

Because there is no easy way to ban lobbying in a democracy. Originally, the term meant someone who hangs around in the lobby of congress (or suchlike) and talks to representatives when they come through. Imagine this is just some ordinary voter who has an important issue on their mind; perhaps someone like Raphael Lemkin. He did that. Non-profit organizations - like Greenpeace - lobby as well. It's hard to forbid lobbying without unintended side effects.

Even if you did, it might not get you where you want. Representatives would still have an open ear for major employers in their districts. After all, voters want those jobs. Representatives meet those bosses on many occasions, like charity events. Money and power can be used to get more money and power.

Personal access is only a part of it, anyway. People influence the media and fund political ads. There's also funding for think tanks and universities. People with money and power (or fame) can do more of that.

Don't assume this is something that just happens behind closed doors, out of the public eye. For example, you may have noticed the recent kerfuffle between actress Scarlett Johansson and OpenAI. OAI allegedly hired a voice actress who sounded too similar to ScarJo. This community here seems to have largely sided with ScarJo. Which means that they want famous people to receive a rent for lending out their voices; a rent which will ultimately be paid by consumers. And if you have a similar voice? Tough.

This is exactly something that many of these AI lobbyists are paid to achieve. They are supposed to get money for the rich people who pay them; preferably without the rich people having to do work.

General_Effort ,

I think it would be unconstitutional in the US in light of Citizens United. I'm sure that there are many things that could be done, but no simple answers like just banning lobbying.

General_Effort ,

That's a bit of a fine point, but yes. They want famous people to have the power to demand a rent, other concessions, or to refuse a deal entirely. So it's about more than just rent. It's the same power that landlords have, but eventually it's all about the money. If you equate it to stealing, then it's about the money, no?

General_Effort ,

Haywire. You don't see that often anymore.

General_Effort ,

I doubt this has to do with "powerful people". A DDoS attack does not remove anything from the net; it only makes it temporarily hard to reach.

There are firms that specialize in suppressing information on the net. They use SEO tricks to get sites down-ranked, as well as (potentially fraudulent) copyright and GDPR requests.

There must be any number of "little guys" who hate the Internet Archive. They scrape copyrighted stuff and personal data "without consent" and even disregard robots.txt. Lemmy is full of people who think that people should go to jail for that sort of thing.

General_Effort ,

I don't think so, reading their terms.

General_Effort ,

There are some carve-outs for FOSS, but the biggest problem is that the copyright lobby got in a body blow. It won't be enough to make the IP fanatics of Lemmy happy, but it'll make some people money.

Model makers must have a policy in place to ensure compliance with the EU copyright directive. That means that websites can use a machine-readable opt-out from AI training. I think this is a big reason why we're now hearing about these deals with Reddit and other copyright holders.
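
In practice, such a machine-readable opt-out often takes the form of a robots.txt-style directive. A minimal sketch of checking for one, assuming the robots.txt convention that crawlers like OpenAI's GPTBot say they honor (the helper function here is just illustrative, and whether this satisfies the directive is a legal question, not a technical one):

```python
# Minimal sketch: check whether a site disallows a given AI-training crawler
# in its robots.txt. Assumes the robots.txt convention (e.g. OpenAI's GPTBot
# user agent); not a statement of what legally counts as a valid opt-out.
from urllib import robotparser

def may_train_on(site_url: str, crawler: str = "GPTBot") -> bool:
    rp = robotparser.RobotFileParser()
    rp.set_url(site_url.rstrip("/") + "/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt
    # can_fetch() is False if the named user agent is disallowed for that URL
    return rp.can_fetch(crawler, site_url)

print(may_train_on("https://example.com"))
```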

Also, model makers must create a summary of the copyrighted training data used. The AI office, which is supposed to enforce this act, is to provide a template for a sufficiently detailed summary. A lot will depend on the AI office and the courts.

A couple problems are obvious. Many enthusiasts will not bother with the paperwork. What will that mean for Hugging Face? Or the AI Horde? The EU is certainly the wrong place to build a business around hosting AI models.

The current open models will likely become "illegal". The makers would have to retroactively provide the necessary documentation, but why would they bother? Even if they did, there is the question about the policy regarding the opt-out. Mind, that doesn't outlaw possession or use. It simply means that businesses may be fined for providing downloads or inference to EU residents.

I think this will likely have a chilling effect on open models. EG Meta could simply say that "Llama 4", if it were to come, is off-limits in the EU. But that might not be enough to indemnify them. Or they could try to comply, which would cost them money for no clear gain. And/or they'd have to leave out data without regard for quality.

Research institutions are not bound by the opt-out. They might become a source of open models.

The carve-outs also do not apply to so-called high-risk AI systems. That would make sense if the act made sense.

LLMs, image diffusion models, and such are termed GPAI (general purpose AI). They are not considered high-risk, by default. They are considered high-risk only once they are adapted to a high-risk purpose. Of course, the line isn't that clear. Say, a teacher could use an LLM to grade tests. Regulators might cause problems there.

General_Effort ,

This is one of those cases where no one agrees what "the right thing" is. Owners think it's right that they collect rent from their property. Me, I think the wider interests of society take precedence.

When the copyright directive was passed in 2019, there was a lot of opposition to it. The guy who had a lot of say as a (sorta) committee chair back then is the same one who oversaw the AI Act now. Few people at the time cared that it regulated AI training. I think the lobbying came mainly from academics who understood that the oppressive IP laws in many EU countries made ML all but illegal. I'm sure, if the copyright industry had foreseen the importance of the AI training provisions, the situation would be much worse for the EU now.

Unfortunately, the people who might argue for the wider interests of society don't have the wherewithal to meaningfully contribute here. Few people know what AI is, and no one knows what it will be in a few years. There is a lot of rubbish in the act that will do more harm than good, in the name of protecting society. But because it is so ill thought out, I doubt it will do much either way. The copyright fanatics were the real damage dealers.

General_Effort ,

I can probably answer, but I'm not sure if I get the question. You want to know the banned practices?

General_Effort ,

Cool. I wrote a fairly lengthy post on that in the comments here.

I don't think they are in compliance with EU law. You can still possess and use them legally, but businesses that offer them to EU residents may be fined. Unless I missed something, private use, even professional use (EG for writing code) is fine. You wouldn't want to build a business around hosting open models in the EU, though.

General_Effort ,

Saying that it's "statistics" is, at best, unhelpful. It conveys no useful information. At worst, it's misleading. What goes on with neural nets has very little to do with what one learns in a stats course.

General_Effort ,

I would not expect almost human-like conversation on being told that it is just statistics. I'd expect something like the old Markov chain jobs. What kind of knowledge leads you to have higher expectations?
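
For illustration, a bare-bones Markov chain text generator (toy code, purely to show the level of "statistics" I mean) looks roughly like this:

```python
# Toy word-level Markov chain: picks the next word based only on which words
# were seen following the current one. This is the sort of output I'd expect
# from "just statistics", not fluent conversation.
import random
from collections import defaultdict

def build_chain(text: str) -> dict:
    words = text.split()
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain: dict, start: str, length: int = 10) -> str:
    out = [start]
    for _ in range(length):
        successors = chain.get(out[-1])
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
print(generate(build_chain(corpus), "the"))
```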

Also, how does Bayesian statistics enter into this?

General_Effort ,

Yes, that's a valid comparison. It's worse with neural nets, though. Much of machine learning is literally applied statistics. That is, a program is written that applies statistical methods to data and then adjusts its behavior. So, saying that it's statistics has the potential to really send people down the wrong track. Many of the "human hallucinations" about AIs result from confusion about this.

General_Effort ,

Those aren't the basics, though. That's how saying it's statistics is misleading. A Bayesian network is not a neural network.

General_Effort ,

Refreshing to see a post on this topic that has its facts straight.

EU copyright allows a machine-readable opt-out from AI training (unless it's for scientific purposes). I guess that's behind these deals. It means they will have to pay off Reddit and the other platforms for access to the EU market. Or more accurately, EU customers will have to pay Reddit and the other platforms for access to AIs.

General_Effort ,

No one ever said ATM code is law. Ethereum code is supposed to be. "Code is law" is one of their slogans.

Everything that a blockchain does could be handled by a single office computer. The whole reason for the huge, expensive overhead is to put crypto beyond the law. Stuff like this exposes the whole, huge waste of human effort.

General_Effort , (edited )

I'll try a simple explanation of what this is about, because this is hilarious. It's the kind of understated humor you get in a good British comedy.

For a payment system you must store who owns how much and how the owners transfer the currency. Easy-peasy. A simple office PC can handle that faster and cheaper than a blockchain. But what if the owner of the PC decides to manipulate the records? No problem, you just go to the police with your own records and receipts and they go to jail for fraud. Their belongings are sold off to pay you damages. That's how these things have worked since forever. It's how businesses keep track of their debts.

Just one little problem: What if the government wants your money? Maybe you don't want to pay your taxes, or some fine. Or maybe you have debts you don't want to pay, like your alimony. Perhaps the government wants to seize the proceeds from a drug deal. They can just go to the record keeper and force them to transfer the currency.

This is where cryptocurrencies come to the rescue (as it were). There are different schemes. ETH (Ethereum) uses validators. The validators are paid to take care of the record-keeping. The trick is that you have to put down ETH as collateral (called staking) to run a validator. If you manipulate the record/blockchain, then the other validators will notice and raise the alarm. That results in you losing your collateral.

This means the validators can remain anonymous. You don't need to know their identities to punish them for fraud. You just take their crypto-money. They need to remain anonymous so that the government (or the mob) can't get to them.
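
In toy terms, the incentive scheme is roughly this (a much simplified sketch, not actual Ethereum client code; only the 32 ETH minimum stake is a real figure):

```python
# Simplified sketch of proof-of-stake incentives: validators post collateral
# ("stake"); if the others flag a manipulated record, the cheater's stake is
# destroyed ("slashed"). No real Ethereum mechanics here, just the idea.

class Validator:
    def __init__(self, name: str, stake: float):
        self.name = name
        self.stake = stake          # collateral put down to participate
        self.slashed = False

    def slash(self):
        # Punishment needs no identity and no court: the collateral is burned.
        self.slashed = True
        self.stake = 0.0

def audit(proposed_record: str, honest_record: str, proposer: Validator):
    # Other validators compare the proposed record against what they observed.
    if proposed_record != honest_record:
        proposer.slash()

v = Validator("anon-validator-1", stake=32.0)  # 32 ETH: the real minimum stake
audit(proposed_record="Alice pays Bob 0 ETH",
      honest_record="Alice pays Bob 5 ETH",
      proposer=v)
print(v.slashed, v.stake)  # True 0.0
```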

This is where it gets hilarious. These two brothers operated fraudulent validators. The stake, the collateral, didn't matter at all. The whole scheme didn't matter. It was a horrible waste of money and effort. The indictment even details how they tried to launder the crypto, that is, how they tried to transfer it so that it couldn't be traced on the blockchain. It even has the search queries they used to look up how to do that.

The whole point of it all is that you supposedly do not need the government to prosecute anyone. If validators are kept honest by the threat of criminal prosecution, then you do not need the whole Proof of Stake scheme. You do not need the whole expensive overhead.

The only rational reason for crypto to exist is to avoid laws; buying drugs and whatnot. I'm not judging. The hilarious fact is that the law knew everything about these guys.

It's all a sham. The one thing that crypto is supposed to do: Foil the government. And it doesn't work.


When people want to buy crypto on the blockchain, they put out a request so that a validator will execute the transaction and record it on the blockchain. While the request is waiting, a bot comes along and scans it. It may be that the purchase changes the exchange value of a currency. In that case, the bot adds two more transactions: one to buy that currency before the original request, and one to sell it afterward. The original request drives up the price between the buy and the sell, so the bot makes a profit for its operator. The original requester has to pay a little extra; that's where the profit comes from.
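
A toy model of such a "sandwich", with made-up numbers and simple constant-product pool math rather than real exchange mechanics, shows where the extra cost lands:

```python
# Toy "sandwich": the bot buys before the victim's pending trade, lets the
# victim's buy push the price up, then sells right after. Constant-product
# pool (token * cash = k) with invented numbers; no fees, no real exchange.

def buy(pool_token: float, pool_cash: float, spend: float):
    """Spend `spend` cash; return (tokens received, new pool_token, new pool_cash)."""
    k = pool_token * pool_cash
    new_cash = pool_cash + spend
    new_token = k / new_cash
    return pool_token - new_token, new_token, new_cash

pool_token, pool_cash = 1_000.0, 1_000.0          # hypothetical liquidity pool

# 1. Bot front-runs: buys with 100 before the victim's request executes.
bot_tokens, pool_token, pool_cash = buy(pool_token, pool_cash, 100)

# 2. The victim's original request executes at the now-worse price.
victim_tokens, pool_token, pool_cash = buy(pool_token, pool_cash, 100)

# 3. Bot back-runs: sells its tokens into the price the victim pushed up.
k = pool_token * pool_cash
pool_token += bot_tokens
bot_payout = pool_cash - k / pool_token

print(f"bot spent 100, got back {bot_payout:.2f}")  # ~118: paid for by the victim
```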

Sound shady? I hope not, because that's what the victims did.

The accused operated their own validators. At the right time, they put out their own buy request to lure in a bot. When the bot proposed the bundled transactions, their validators feigned acceptance but then switched out the lure transaction, replacing the buy with a sell.

The indictment makes a fairly good argument. It's like there is a "contract" between these automatic systems. The trading bot wants the bundled transactions to be carried out exactly so. The validator feigns agreement, but does not follow through.

General_Effort ,

It reminded me of high-frequency trading.


Mind, the people who do that are the victims here!

I didn't explain how exactly they were harmed. It's actually kinda funny, too.

It costs virtually nothing to create crypto-tokens. So that's what people do. Do some wash trades, slip some money to influencers to hype their new token as the next big thing, then offload the whole supply and run with the money. The "investors" quickly discover that these tokens are only good for one thing: To sell to a greater fool. At that point, there are no more buyers.

The accused obtained such useless tokens. The indictment doesn't say how. I guess they simply bought them for next to nothing.

Effectively, they tricked the victims' bots into buying these tokens at face value. The victims were left with crypto supposedly worth $25 million but in reality unsellable. If this was stealing $25 million, then I wonder about the legality of selling these crypto tokens in the first place.

Eventually, all crypto is like that. Some cryptocurrencies are used as payment systems, but eventually something better must come along. Then that currency becomes unsellable. Someone must always be left holding the bag, as it is said in crypto circles.

I think they are guilty of fraud. But I do wonder: If we are to accept that leaving someone with worthless crypto is equal to stealing money, what does that mean for the legality of crypto as a whole?

General_Effort ,

I thought the same thing, but mind: That's what the victims did. See my other reply going into this more.

General_Effort ,

I mean, yes, but I can't help pointing out that all money is made up. The only difference is the purpose of the money system.

General_Effort ,

I wish people would go straight to the source for these stories. No reason to link to something that only paraphrases a press release and adds some ads.

Press release (contains link to indictment):

https://www.justice.gov/usao-sdny/pr/two-brothers-arrested-attacking-ethereum-blockchain-and-stealing-25-million

General_Effort ,

That doesn't even make sense. I have the mild suspicion that the fossil fuel industry sponsors nonsense like that, as a distraction from sane measures.

What we need to do to stop global warming is very simple: Stop using fossil fuels. We must not add CO2 to the atmosphere.

AI has nothing to do with that. It's just one more use for electricity. If we wanted to stop global warming, we would get the electricity by saving elsewhere, or generating more carbon-neutral electricity, with solar, wind or what not. We simply chose not to do that.

Hello GPT-4o (openai.com)

GPT-4o (“o” for “omni”) is a step towards much more natural human-computer interaction—it accepts as input any combination of text, audio, and image and generates any combination of text, audio, and image outputs. It can respond to audio inputs in as little as 232 milliseconds, with an average of 320 milliseconds,...

General_Effort ,

Hugging Face is the usual platform for sharing datasets and models.

General_Effort ,

Good question. That is (almost certainly) political speech and as such especially protected by law. It's also quite controversial and so companies will try to prevent their services being used for it.

After announcing increased prices, Spotify to Pay Songwriters About $150 Million Less Next Year (www.billboard.com)

When Bloomberg reported that Spotify would be upping the cost of its premium subscription from $9.99 to $10.99, and including 15 hours of audiobooks per month in the U.S., the change sounded like a win for songwriters and publishers. Higher subscription prices typically equate to a bump in U.S. mechanical royalties — but not...

General_Effort ,

In 2023, Taylor Swift got $100 million from Spotify. How much should she get?

General_Effort ,

They are also retained by anyone who has archived them, like OpenAI or Google, thus making their AIs more valuable.

To really pull up the ladder, they will have to protest the Internet Archive and Common Crawl, too. It's just typical right-wing bullshit; acting on emotion and against their own interests.

General_Effort ,

They are not. A derivative would be a translation or a theater play; nowadays, also a game or a movie. Even stuff set in the same universe.

Expanding the meaning of "derivative" so massively would mean that pretty much any piece of code ever written is a derivative of technical documentation and even textbooks.

So far, judges simply throw out these theories, without even debating them in court. Society would have to move a lot further to the right, still, before these ideas become realistic.

General_Effort ,

The EU tends to be much harsher in these matters, though some members don't follow along.

General_Effort ,

The article alleges, though without evidence, that the tracking is just an excuse to raise rates.

A quick search didn't turn up quite the right statistics, but traffic fatalities have been seriously on the rise in the US. That probably implies higher payouts. (WP)

But also, when trackable unsafe drivers have to pay more (and trackable safe drivers less), then the unsafe drivers will prefer to be untrackable. You may be on the receiving end of the recalculated actuarial tables.
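
A back-of-the-envelope example, with invented numbers, of how that recalculation works:

```python
# Invented numbers: what opt-in tracking does to the untracked pool.
safe_cost, unsafe_cost = 500, 1500          # expected annual claims per driver
n_safe, n_unsafe = 80, 20                   # hypothetical mix of 100 drivers

# Before tracking: everyone pays the pooled average.
flat = (n_safe * safe_cost + n_unsafe * unsafe_cost) / (n_safe + n_unsafe)
print(f"flat premium for everyone: {flat:.0f}")   # 700

# After tracking: safe drivers accept the tracker and pay close to their own
# cost; unsafe drivers opt out, so the untracked pool's average cost rises.
print(f"tracked (mostly safe) premium:  ~{safe_cost}")
print(f"untracked premium drifts toward ~{unsafe_cost}")
```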

General_Effort ,

Behind every successful man there is a woman, making him look like he has a mullet.

General_Effort ,

Yes. It is a new tool for vfx artists and not a replacement. If they can deliver higher quality for less money, you'd expect them to be more in demand.

"Never" is a big word, but it's really not clear how one would train an AI to know what it should generate. See the hubbub about diversity in google's image generator. I see no theoretical problems, but in practice it's just not going to happen any time soon.
