

Rentlar , to Technology in Canadian Supreme Court Rules Police Now Need a Warrant to Get a Person's IP

The ruling said the privacy interests cannot be limited to what the IP address can reveal on its own "without consideration of what it can reveal in combination with other available information, particularly from third-party websites."

It went on to say that because an IP address unlocks a user's identity, it comes with a reasonable expectation of privacy and is therefore protected by the Charter.

Personally I agree with the majority opinion here. "For the safety of children and crime victims" is too often used as an excuse to unleash wide-reaching attacks on privacy.

Police will still be able to obtain the information they need when the cases involving children and victims of crime happen, they just need to get permission from the courts. This ruling seems to prevent law enforcement from doing an internet analogue of "carding", requesting and obtaining random Canadian IPs in search of something to prosecute.

autotldr Bot , to unions in Canada art gallery strike: $100 million and not thinking about employees, they know they're the biggest game in town and they can push people around

This is the best summary I could come up with:


Hundreds of employees from the Art Gallery of Ontario (AGO) gathered on the picket line as they began strike action Tuesday.

"We anticipate hopefully that the employer sees our strength and that it starts an internal conversation with them to maybe rethink how they've been treating their employees," said Mark Thornberry, an event setup coordinator who has worked with the museum for 15 years.

OPSEU local president Paul Ayers says public service employees struggled through the COVID-19 pandemic and three years of wage freezes, and cannot afford to keep up with inflation.

In the AGO's most recent publicly available financial documents, which cover April 1, 2022 to March 31, 2023, the gallery reported a deficit of $3.8 million.

Standing at the front of the crowd Tuesday, Meagan Christou, a member of the bargaining team for OPSEU Local 535 and a worker with the AGO for seven years, led a chant of 'No Wages, No Art.'

Christou says their team is still hoping to bargain and don't want a strike to last too long, but that there needs to be movement on wages and precarious working conditions.


The original article contains 608 words, the summary contains 180 words. Saved 70%. I'm a bot and I'm open source!

Spacehooks , to Interesting Global News in Canadian DNA lab knew its paternity tests identified the wrong dads, but it kept selling them

Hope they get sued to hell

BrikoX , to Canada in Liberals accuse Conservatives of using AI for amendments to jobs bill as votes loom
@BrikoX@lemmy.zip avatar

This is basically a double insult. Either they really did use AI, or they didn't and their amendments are just so bad that people assume AI wrote them.

avidamoeba ,
@avidamoeba@lemmy.ca avatar

Time for a new bill outlawing AI generated text from bills. 😂

autotldr Bot , to Technology in Minister suggests Canada is considering tariffs on Chinese EVs following U.S. move

🤖 I'm a bot that provides automatic summaries for articles:

Click here to see the summary

François-Philippe Champagne wouldn't rule out Canada imposing similar tariffs during an interview with CBC News Network's Power & Politics on Friday.

President Joe Biden announced earlier this week that the U.S. would be slapping new tariffs on Chinese electric vehicles (EVs), advanced batteries, solar cells, steel, aluminum and medical equipment.

There are currently very few EVs from China in the U.S., but American officials worry that low-priced models made possible by Chinese government subsidies could soon start flooding the U.S. market.

In a separate interview on Tuesday, Flavio Volpe, president of the Automotive Parts Manufacturers' Association, said "Canada has to" implement similar trade levies.

"Now that the Americans have put up a tariff wall, we can't leave the side door open here," Volpe told guest host John Paul Tasker.

The federal government has partnered with provinces to attract investments from major automotive manufacturers to spur electric vehicle production in Canada.


Saved 72% of original text.

BrikoX Mod , to Technology in Minister suggests Canada is considering tariffs on Chinese EVs following U.S. move
@BrikoX@lemmy.zip avatar

Biden strongly opposed increasing tariffs on EVs when he first came into office because they would lead to huge price increases for consumers and backfire. Now he has changed his mind, but the consequences haven't changed. Canada getting in between the US and China in this trade war is a bad idea.

tardigrada OP ,

This is not 'only' about trade or dominance in a particular market such as EVs or solar panels. China aims to leverage market dominance for political influence. The Chinese government wants to export not just products but its autocratic system.

BrikoX Mod ,
@BrikoX@lemmy.zip avatar

Either China sucks at it or that's not their goal, since they have been trading with each other for 40+ years.

And US welcomes its autocrats like Donald Trump on its own, they don't need China for that.

tardigrada OP ,

There is considerable evidence and a strong body of research on this. As researchers write in the Journal of Democracy, for example:

China’s Threat to Global Democracy (archived link)

China’s economy is slowing, and the regime is coming under greater domestic pressure—witness the large-scale protests that broke out against Xi’s covid-zero policy in multiple cities and on dozens of university campuses in late 2022. Beijing is encountering growing international criticism and resistance on other fronts as well. Around the world, negative views of China have surged to highs not seen since the 1989 Tiananmen Square Massacre [...]

China’s rulers also have long understood what political scientists have proven empirically: Autocracies often fall in waves, as revolutionary activity in one country inspires popular uprisings in others [...]

The CCP has responded with stepped-up repression over the past decade—jailing dissidents, mobilizing security forces, censoring information, and preempting popular unrest. Yet China is now strong enough that it can do more than just hunker down in the face of foreign pressure. Xi believes that the CCP’s domestic power will be enhanced if authoritarianism is prevalent and democracies are dysfunctional—fellow despots will not punish China for rights abuses, and the Chinese people will not want to emulate the chaos of liberal systems. He thinks that preventing revolts against authoritarianism in other countries will lower the odds of such a revolt erupting in China. And he believes that silencing critics abroad will limit the challenges facing the CCP within China. Xi sees rolling back democracy overseas as part of his plan to secure his regime at home [...]

Beijing spends billions of dollars annually on an “antidemocratic toolkit” of nongovernmental organizations, media outlets, diplomats, advisors, hackers, and bribes all designed to prop up autocrats and sow discord in democracies. The CCP provides fellow autocracies with guns, money, and protection from UN censure while slapping foreign human-rights advocates with sanctions. Chinese officials offer their authoritarian brethren riot-control gear and advice on building a surveillance state; PRC trade, investment, and loans allow those dictators to avoid Western conditionality regarding anticorruption or good governance.

Beijing uses its globe-spanning media organs to tout the accomplishments of illiberal rule while highlighting democratic governments’ flaws and hypocrisies. China works with fellow authoritarian regimes, such as Vladimir Putin’s in Russia, to push autocrat-friendly norms of internet management in international institutions and standards-setting bodies.

These are some quotes, but the whole article makes an interesting read.

BrikoX Mod ,
@BrikoX@lemmy.zip avatar

It was an interesting read. Thanks.

And while there are many points I agree with, the core principle that China wants to use trade to export autocracy just doesn't mesh. Trade is the best way to prevent military conflicts since trade creates dependencies which would be cut off during any military action. I would say the opposite is happening now with the current wave of protectionism in both the US and China.

China will take opportunities to insert itself into other countries when it suits it, but that's no different from what democratic countries do. Just look at Africa, where to this day France refuses to abandon the economic influence it holds via the CFA franc. Or the US trying to install democratic rule in countries of strategic value by any means necessary (the publication claims the US has abandoned that strategy, but we know that's not true; there are public details about US coup attempts in Venezuela just recently).

PerogiBoi , to Technology in Minister suggests Canada is considering tariffs on Chinese EVs following U.S. move
@PerogiBoi@lemmy.ca avatar

I’m in the market for a new car in Canada. I don’t want a gas car. Environmental reasons aside, they just cost too much and have too many moving parts that can wear out.

I can afford almost any gas car I want. The moment it becomes electric, I cannot afford it. A $10k BYD looks nice, but my government has decided I actually need to pay a starting price of $45k instead.

Dark_Arc ,
@Dark_Arc@social.packetloss.gg avatar

The reason the US and Canadian governments are doing this is to stop that $10k car from destroying the automotive industry in North America, resulting in layoffs that would make the recent tech layoffs look like peanuts.

I agree we need cheaper EVs in North America, I want one too... There's an Ars Technica article where Ford basically goes "we thought everyone wanted expensive trucks ... we made those electric ... we realize we missed the mark, we're going to work on smaller, cheaper, EVs." So, they are coming hopefully within the next couple of years.

I'm not sure how important manufacturing still is to the Canadian economy, but for the US economy ... trying to protect domestic production is important (and we should've done it years ago instead of letting cheap Chinese imports destroy a large amount of the factories in North America).

JayTreeman ,

These NA car companies always get bailed out. They should've been making kei trucks, and small electric cars for ages, but they don't need to because they'll just get bailed out if they fail.

These tariffs are another form of a bailout. Maybe instead of bailing these guys out we should nationalize them.

Dark_Arc ,
@Dark_Arc@social.packetloss.gg avatar

That reduces a lot of relevant context, like why they needed the 08 bailouts in the first place, how many times they've been bailed out, and the fact that China has heavily subsidized these cars to the point that even if they were making the same vehicle, it would be significantly more expensive.

JayTreeman ,

The US manufacturers get bailed out every 30 years or so, so the specific reason for the most recent bailout becomes less relevant.
What makes BYD so cheap isn't government subsidies, which NA manufacturers also get, but vertical integration. BYD is a battery company that started making cars. It can sell batteries to the car side of the business below cost as long as the final product makes the larger corporation a profit.
Tesla, for example, buys its batteries from Panasonic, and Panasonic has to make a profit. That makes a Tesla much more expensive than it should be.

Thevenin ,

Transportation is a necessity, and I believe every inelastic market deserves a nationalized alternative to prevent price gouging. Like how the USPS keeps UPS and FEDEX in line. With that being said, nationalization doesn't fix this particular problem.

China is run like a giant capitalist cartel (in all but name), and appropriately, their ultimate weapon in their hunt for global monopolies is the provision of slave labor. The number of slaves in Xinjiang alone is estimated in the hundreds of thousands, and their labor has been credibly linked to the production of cotton (face masks), polysilicon (solar panels), and aluminum and lithium (EVs).

It's no coincidence that these are the industries being slapped with tariffs. No amount of subsidization or nationalization can level a playing field that's been tilted by slavery. You don't outcompete slavery, you either penalize goods suspected of involving it, or you go full John Brown.

JayTreeman ,

I agree, but that's a slippery slope. Lots of countries use slavery to make cheap goods.
But yes. Slavery in all forms should be abolished

PerogiBoi ,
@PerogiBoi@lemmy.ca avatar

I get that there’s protectionism of local industry, but clearly the market doesn’t want what the industry is making. We are held captive to whatever the industry thinks we want. It’s not a real free market. We are prescribed options to bail out the ailing automotive industry (whatever is left of it after outsourcing everything to Mexico and SEA).

Thalestr , to Technology in Minister suggests Canada is considering tariffs on Chinese EVs following U.S. move
@Thalestr@beehaw.org avatar

Goddamn, I wish we could act like our own independent country for once instead of just puppeting whatever the US does.

TheDudeV2 OP , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

I’m not an Information Theory guy, but I am aware that, regardless of how clever one might hope to be, there is a theoretical limit on how compressed any given set of information could possibly be; and this is particularly true for the lossless compression demanded by this challenge.

Quote from the article:

The skepticism is well-founded, said Karl Martin, chief technology officer of data science company Integrate.ai. Martin's PhD thesis at the University of Toronto focused on data compression and security.

Neuralink's brainwave signals are compressible at ratios of around 2 to 1 and up to 7 to 1, he said in an email. But 200 to 1 "is far beyond what we expect to be the fundamental limit of possibility."

orclev ,

The implication of a 200 to 1 algorithm would be that the data they're collecting is almost entirely noise. Specifically that 99.5% of all the data is noise. In theory if they had sufficient processing in the implant they could filter the data down before transmission thus reducing the bandwidth usage by 99.5%. It seems like it would be fairly trivial to prove that any such 200 to 1 compression algorithm would be indistinguishable in function from a noise filter on the raw data.

It's not quite the same situation, but this should show some of the issues with this: https://matt.might.net/articles/why-infinite-or-guaranteed-file-compression-is-impossible/
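To make the noise point concrete, here's a minimal Python sketch (using the standard-library zlib purely for illustration) showing that uniformly random bytes barely compress at all, while highly regular data compresses enormously:

```python
import os
import zlib

# ~1 MB of uniformly random bytes: effectively pure noise.
noise = os.urandom(1_000_000)

# ~1 MB of highly regular "signal": one short pattern repeated.
signal = b"spike\x00\x01" * (1_000_000 // 7)

noise_ratio = len(zlib.compress(noise, 9)) / len(noise)
signal_ratio = len(zlib.compress(signal, 9)) / len(signal)

print(noise_ratio)   # ~1.0: random data is essentially incompressible
print(signal_ratio)  # a tiny fraction: the redundancy compresses away
```

If the implant's raw stream really allowed 200:1 lossless compression, the data would have to look far more like the second case than the first.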

Stardust ,

There is a way they could make the majority of it noise - if they reduced their expectations to only picking up a single type of signal, like thinking of pressing a red button, and tossing anything that doesn't roughly match that signal. But then they wouldn't have their super fancy futuristic human-robot mind meld dream, or dream of introducing a dystopian nightmare where the government can read your thoughts...

Paragone ,

The problem isn't "making the majority of it noise",

the problem is tossing-out the actual-noise, & compressing only the signal.

Without knowing what the actual-signal is, & just trying to send all-the-noise-and-signal, they're creating their problem, requiring 200x compression, through wrongly-framing the question.

What they need to actually do, is to get a chip in before transmitting, which does the simplification/filtering.

That is the right problem.

That requires some immense understanding of the signal+noise that they're trying to work on, though, and it may require much more processing-power than they're committed to permitting on that side of the link.

shrug

Universe can't care about one's feelings: making-believe that reality is other than it actually-is may, with political-stampeding, dent reality some, temporarily, but correction is implacable.

In this case, there's nothing they can do to escape the facts.

EITHER they eradicate enough of the noise before transmission,

XOR they transmit the noise, & hit an impossible compression problem.

Tough cookies.

_ /\ _

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

NAND - one of the 2 you listed, or they give up.

Death_Equity ,

Absolutely, they need a better filter and on-board processing. It's like they're just gathering everything and transmitting it for external processing, instead of cherry-picking data that matches a previously trained action and sending that as the output.

I'm guessing they kept the processing power low because of heat or power availability, they wanted to have that quiet "sleek" puck instead of a brick with a fanned heatsink. Maybe they should consider a jaunty hat to hide the hardware.

Gathering all the data available has future utility, but their data transmission bottleneck makes that data-gathering capability worthless. They are trying to leap way too far ahead, prioritizing vanity too highly, and getting bitten for it, about par for the course with an Elon project.

Miaou ,

Huh? That's not what it means at all. Compression saves on redundant data, but it doesn't mean that data is noise. Or are you using some definition of noise I'm not aware of?

TheDudeV2 OP ,

I can try to explain, but there are people who know much more about this stuff than I do, so hopefully someone more knowledgeable steps in to check my work.

What does ‘random’ or ‘noise’ mean? In this context, random means that any given bit of information is equally likely to be a 1 or a 0. Noise means a collection of information that is either random or unimportant/non-useful.

So, you say “Compression saves on redundant data”. Well, if we think that through, and consider the definitions I’ve given above, we will reason that ‘random noise’ either doesn’t have redundant information (due to the randomness), or that much of the information is not useful (due to its characteristic as noise).

I think that’s what the person is describing. Does that help?
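One way to see this numerically is Shannon entropy: random bytes sit near the 8 bits/byte ceiling (no redundancy to squeeze out), while regular data sits far below it. A small Python sketch, purely as an illustration:

```python
import collections
import math
import os

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy of the byte distribution."""
    counts = collections.Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random_data = os.urandom(100_000)   # near 8.0 bits/byte: nothing to compress
regular_data = b"ab" * 50_000       # exactly 1.0 bit/byte: highly compressible
print(entropy_bits_per_byte(random_data))
print(entropy_bits_per_byte(regular_data))
```

Entropy is (roughly) the floor on how few bits per byte any lossless coder can achieve for data with that symbol distribution.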

Miaou ,

I agree with your point, but you're arguing that noise can be redundant data. I am arguing that redundant data is not necessarily noise.

In other words, a signal can never be filtered losslessly. You can slap a low pass filter in front of the signal and call it a day, but there's loss, and if lossless is a hard requirement then there's absolutely nothing you can do but work on compressing redundant data through e.g. patterns, interpolation, what have you (I don't know much about compression algos).

A perfectly noise free signal is arguably easier to compress actually as the signal is more predictable.

Cocodapuf ,

I'm not sure that's accurate.

Take video for example. Using different algorithms you can get a video down half the file size of the original. But with another algorithm you can get it down to 1/4 another can get it down to 1/10. If appropriate quality settings are used, the highly compressed video can look just as good as the original. The algorithm isn't getting rid of noise, it's finding better ways to express the data. Generally the fancier the algorithm, the more tricks it's using, the smaller you can get the data, but it's also usually harder to unpack.

orclev ,

It's important to distinguish between lossy and lossless algorithms. What was specifically requested in this case is a lossless algorithm which means that you must be able to perfectly reassemble the original input given only the compressed output. It must be an exact match, not a close match, but absolutely identical.

Lossless algorithms rely generally on two tricks. The first is removing common data. If for instance some format always includes some set of bytes in the same location you can remove them from the compressed data and rely on the decompression algorithm to know it needs to reinsert them. From a signal theory perspective those bytes represent noise as they don't convey meaningful data (they're not signal in other words).

The second trick is substituting shorter sequences for common longer ones. For instance if you can identify many long sequences of data that occur in multiple places you can create a lookup index and replace each of those long sequences with the shorter index key. The catch is that you obviously can't do this with every possible sequence of bytes unless the data is highly regular and you can use a standardized index that doesn't need to be included in the compressed data. Depending on how poorly you do in selecting the sequences to add to your index, or how unpredictable the data to be compressed is you can even end up taking up more space than the original once you account for the extra storage of the index.

From a theory perspective everything is classified as either signal or noise. Signal has meaning and is highly resistant to compression. Noise does not convey meaning and is typically easy to compress (because you can often just throw it away, either because you can recreate it from nothing as in the case of boilerplate byte sequences, or because it's redundant data that can be reconstructed from compressed signal).

Take for instance a worst case scenario for compression, a long sequence of random uniformly distributed bytes (perhaps as a one time pad). There's no boilerplate to remove, and no redundant data to remove, there is in effect no noise in the data only signal. Your only options for compression would be to construct a lookup index, but if the data is highly uniform it's likely there are no long sequences of repeated bytes. It's highly likely that you can create no index that would save any significant amount of space. This is in effect nearly impossible to compress.

Modern compression relies on the fact that most data formats are in fact highly predictable with lots of trimmable noise by way of redundant boilerplate, and common often repeated sequences, or in the case of lossy encodings even signal that can be discarded in favor of approximations that are largely indistinguishable from the original.
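The second trick described above can be sketched in a few lines of Python. This is a toy "lookup index" coder, not any real algorithm: known long sequences are swapped for a 2-byte reference, and the escape byte is assumed never to occur in the input (a real coder would have to escape it):

```python
# Shared phrase table: both sides must know it in advance.
PHRASES = [b"the quick brown fox", b"lorem ipsum dolor sit amet"]
ESC = 0xFF  # escape byte marking an index reference (assumed absent from input)

def compress(data: bytes) -> bytes:
    for i, phrase in enumerate(PHRASES):
        data = data.replace(phrase, bytes([ESC, i]))
    return data

def decompress(data: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(data):
        if data[i] == ESC:
            out += PHRASES[data[i + 1]]  # expand the 2-byte reference
            i += 2
        else:
            out.append(data[i])
            i += 1
    return bytes(out)

msg = b"lorem ipsum dolor sit amet, then the quick brown fox appears"
packed = compress(msg)
assert decompress(packed) == msg          # lossless round trip
print(len(msg), len(packed))              # 60 19
```

Note the catch orclev describes: the savings depend entirely on the input actually containing the indexed sequences; feed it unrelated data and nothing shrinks.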

Waldowal ,
@Waldowal@lemmy.world avatar

I'm no expert in this subject either, but a theoretical limit could be beyond 200x - depending on the data.

For example, a basic compression approach is to use a lookup table that allows you to map large values to smaller lookup ids. So, if the possible data only contains 2 values: One consisting of 10,000 letter 'a's. The other is 10,000 letter 'b's. We can map the first to number 1 and the second to number 2. With this lookup in place, a compressed value of "12211" would uncompress to 50,000 characters. A 10,000x compression ratio. Extrapolate that example out and there is no theoretical maximum to the compression ratio.

But that's when the data set is known and small. As the complexity grows, it does seem logical that a maximum limit would be introduced.

So, it might be possible to achieve 200x compression, but only if the complexity of the data set is below some threshold I'm not smart enough to calculate.
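That lookup-table example is easy to make concrete. In this hypothetical sketch, the table is shared in advance by both sides (which is the catch), and the ratio can be made arbitrarily large by growing the blocks:

```python
BLOCK = 10_000
# Shared table known to both compressor and decompressor in advance.
TABLE = {"1": "a" * BLOCK, "2": "b" * BLOCK}

def decompress(ids: str) -> str:
    return "".join(TABLE[i] for i in ids)

expanded = decompress("12211")
print(len(expanded) / len("12211"))  # 10000.0: a 10,000x ratio
```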

QuadratureSurfer , (edited )
@QuadratureSurfer@lemmy.world avatar

You also have to keep in mind that, the more you compress something, the more processing power you're going to need.

Whatever compression algorithm that is proposed will also need to be able to handle the data in real-time and at low-power.

But you are correct that compression beyond 200x is absolutely achievable.

A more visual example of compression could be something like one of the Stable Diffusion AI/ML models. The model may only be a few Gigabytes, but you could generate an insane amount of images that go well beyond that initial model size. And as long as someone else is using the same model/input/seed they can also generate the exact same image as someone else.
So instead of having to transmit the entire 4k image itself, you just have to tell them the prompt, along with a few variables (the seed, the CFG Scale, the # of steps, etc) and they can generate the entire 4k image on their own machine that looks exactly the same as the one you generated on your machine.

So basically, for only about a kilobyte, you can get 20+MB worth of data transmitted this way. The drawback is that you need a powerful computer and a lot of energy to regenerate those images, which brings us back to the problem of conveying this data in real-time while using low power.

Edit:

Tap for some quick napkin math

For transmitting the information to generate that image, you would need about 1KB to allow for 1k characters in the prompt (if you really even need that),
then about 2 bytes for the height,
2 for the width,
8 bytes for the seed,
less than a byte for the CFG and the Steps (but we'll just round up to 2 bytes).
Then, you would want something better than just a parity bit for ensuring the message is transmitted correctly, so let's throw on a 32 or 64 byte hash at the end...
That still only puts us a little over 1KB (1078 bytes)...
So for generating a 4k image (.PNG file) we get ~24MB worth of lossless decompression.
That's 24,000,000 Bytes which gives us roughly a compression of about 20,000x
But of course, that's still going to take time to decompress as well as a decent spike in power consumption for about 30-60+ seconds (depending on hardware) which is far from anything "real-time".
Of course you could also be generating 8k images instead of 4k images... I'm not really stressing this idea to its full potential by any means.

So in the end you get compression at a factor of more than 20,000x for using a method like this, but it won't be for low power or anywhere near "real-time".
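The napkin math above can be checked directly. All the field sizes here are the comment's own assumptions, not anything from Neuralink or Stable Diffusion itself:

```python
prompt_bytes = 1_000        # up to ~1k characters of prompt text
width_bytes = 2
height_bytes = 2
seed_bytes = 8
cfg_and_steps_bytes = 2     # rounded up from under a byte each
hash_bytes = 64             # integrity check on the message

payload = (prompt_bytes + width_bytes + height_bytes
           + seed_bytes + cfg_and_steps_bytes + hash_bytes)
image_bytes = 24_000_000    # ~24 MB 4k PNG

print(payload)                 # 1078
print(image_bytes // payload)  # 22263, i.e. the "about 20,000x" ballpark
```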

Cosmicomical ,

just have to tell them the prompt, along with a few variables

Before you can do that, you have to spend hours of computation to figure out a prompt and a set of variables that perfectly match the picture you want to transmit.

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

Sure, but this is just a more visual example of how compression using an ML model can work.

The time you spend reworking the prompt, or tweaking the steps/cfg/etc. is outside of the scope of this example.

And if we're really talking about creating a good pic it helps to use tools like ControlNet/inpainting/etc... which could still be communicated to the receiving machine, but then you start to lose out on some of the compression, by about 1KB for every additional time you need to run the model to get the correct picture.

Cosmicomical ,

You are removing the most computationally intensive part of the process in your example, that's making it sound easy, while adding it back shows that your process is not practical.

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

The first thing I said was, "the more you compress something, the more processing power you're going to need [to decompress it]"

I'm not removing the most computationally expensive part by any means and you are misunderstanding the process if you think that.

That's why I specified:

The drawback is that you need a powerful computer and a lot of energy to regenerate those images, which brings us back to the problem of making this data conveyed in real-time while using low-power.

And again

But of course, that's still going to take time to decompress as well as a decent spike in power consumption for about 30-60+ seconds (depending on hardware)

Those 30-60+ second estimates are based on someone using an RTX 4090, the top end Consumer grade GPU of today. They could speed up the process by having multiple GPUs or even enterprise grade equipment, but that's why I mentioned that this depends on hardware.

So, yes, this very specific example is not practical for Neuralink (I even said as much in my original example), but this example still works very well for explaining a method that can allow you a compression rate of over 20,000x.

Yes you need power, energy, and time to generate the original image, and yes you need power, energy, and time to regenerate it on a different computer. But to transmit the information needed to regenerate that image you only need to convey a tiny message.

Cocodapuf ,

Neurons work with analogue signals; I'm not sure lossless algorithms are necessary.

dullbananas , to Technology in Neuralink looks to the public to solve a seemingly impossible problem
@dullbananas@lemmy.ca avatar

Submit your algorithms under GPL

potatopotato ,

AGPL just in case they try to put your brain waves into the cloud

orclev ,

GPLv3, make it really radioactive to them.

random_character_a , to Technology in Neuralink looks to the public to solve a seemingly impossible problem
@random_character_a@lemmy.world avatar

Zombie signal not strong enough to make people think Elon is an eccentric genius and not a loud moron?

palordrolap , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

Surprised they haven't tried to train a neural network to find a compression algorithm specifically for their sort of data.

There's a ridiculous irony in the fact they haven't, and it's still ironic even if they have and have thrown the idea out as a failure. Or a dystopian nightmare.

But if it is the latter, they might help save time and effort by telling "the public" what avenues have already failed, or that they don't want purely AI-generated solutions. Someone's bound to try it otherwise.

orclev ,

They did, but then Elon insisted they add a virtual neuralink into it and now the neural network is braindead.

kibiz0r , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

How do you send 200x as much data?

You don’t. The external system needs to run an approximation of the internal system, which the internal system will also run and only transmit differences.

There you go. Solved it. (By delegating to a new problem.)

dariusj18 , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

Did they try Stack overflow?

RobotZap10000 ,

Why would you ever want to do that?! Marked as duplicate. Shove a cactus up your ass.

bus_factor ,

Why not skip the middle man and ask ChatGPT directly?

billiam0202 ,

*GrokAI

You know, Xitter's shittier AI.

bus_factor ,

Fair, I was thinking in the context of Stack Overflow.

SuperFola , to Technology in Neuralink looks to the public to solve a seemingly impossible problem
@SuperFola@programming.dev avatar

Just add 199 more transmitters
