
cbc.ca

kautau , to Technology in SUV stolen from Toronto driveway shows up 50 days later — AirTags tracked vehicle from Canada to Middle East, offering glimpse into shipping routes used by car thieves

Andrew received a picture taken from inside a police car, parked near two containers sitting on a railcar. "It's definitely in one of those containers," the officer said in a series of text messages viewed by CBC News. But the York officer said they didn't "have the authority to open the containers." Instead, they directed Andrew to the railway's private police service.

Andrew said CPKC police didn't respond to the scene that night and the train carrying his truck took off soon after. "That's the pinnacle of the frustration," Andrew told CBC, "knowing that it's still here, but it's about to disappear."

CPKC spokesperson Terry Cunha declined to discuss the incident, but said in a statement the railway "works with federal, provincial and local law enforcement agencies executing a number of strategies to identify and recover stolen vehicles."

Someone’s palms are real greasy here

Kecessa ,

"Wait, you expect us to actually do our job :( "

It's a PRIVATE police service, the PUBLIC police force shouldn't have to ask anything from them and should be laughing in their face as they're opening containers.

SomeKindaName ,

They should need to get a warrant, but that doesn't sound too hard in this instance.

shasta ,

Yep and judges don't work nights. Nothing for the police to do in this instance except wait for the next day... And then it was gone.

dullbananas , to Technology in Neuralink looks to the public to solve a seemingly impossible problem
@dullbananas@lemmy.ca avatar

Submit your algorithms under GPL

potatopotato ,

AGPL just in case they try to put your brain waves into the cloud

orclev ,

GPLv3, make it really radioactive to them.

aelwero , to Technology in No more Pornhub? That will depend on what happens with a Senate bill

options include the establishment of a digital ID system or services that can estimate an individual's age based on a visual scan of their face.

Oh yeah... I'm sure that won't cause any unintended issues at all...

pineapplelover ,

The future is back to playboy magazines

Max_P , to Technology in No more Pornhub? That will depend on what happens with a Senate bill
@Max_P@lemmy.max-p.me avatar

In a world where there's VPN ads literally everywhere and even bitcoin ATMs, I'm sure that will be of tremendous effectiveness

ininewcrow ,
@ininewcrow@lemmy.ca avatar

Just like the pop-up notice when you visit Pornhub for the first time that asks you to click whether you are 18 or not.

Which makes me wonder what happens when you click that you're not.

DarkMessiah ,

I clicked it out of curiosity, once. It just took me to Google.

eager_eagle ,
@eager_eagle@lemmy.world avatar

I'd like to try that too, but unfortunately I'm over 18.

Hadriscus ,

You could get a child to click that button for you

hakunawazo ,

So it would only be fair if all the underage users on Google got redirected to PH in return. (jk)

Weslee ,

I gotta wonder if these people know that these kinds of laws will do nothing and they're just pandering, or if they actually think this time they've got it.

Max_P ,
@Max_P@lemmy.max-p.me avatar

I think they all just fundamentally don't understand how the Internet works and how it doesn't care about borders.

They approach it like companies are providing services to users directly, like you just walked into a store and they're in full control of everything. As if companies are explicitly entering every market worldwide by being available on the Internet and providing their services to users. Obviously if you provide services to Canadian users you must be a company with a presence on Canadian soil.

Except you can't exactly put customs on the Internet like you can block sketchy imports from China when they arrive at the border. It literally crosses the border at the speed of light.

aelwero ,

I'm reasonably certain that once enough governments jump on the "we need to control the internets" bandwagon, there will be a region specific convention adopted similar to country codes for phone numbers so that they can, in fact, apply customs to it...

I suspect it won't be in the name of righteousness though, more likely it'll be taxes, copyright, etc, on internet sales that trigger it.

jaybone ,

Don’t IP addresses already provide you with region info?

Weslee ,

America's IP addresses are governed by ARIN, how long until you see that name in headlines for controversy...

Weslee ,

But somewhere down the line someone knows, either the lawmakers, or the advisors, or maybe they all know and it's just grandstanding to those of the public that don't know.

Really, all this does is train people with the drive or ability to learn things like DevOps to be even better at circumventing it. Well, this one isn't that hard to get around, but laws like this generally do.

tsonfeir ,
@tsonfeir@lemm.ee avatar

VPN bans are next.

iopq ,

How do you ban a VPN? If I connect to a server that's a big cloud provider, how do you know the connection is to a VPN or a website?

anyhow2503 ,

It's not feasible to prevent it completely, but you can certainly make it harder for the average person and discourage usage by simply outlawing it. That's what China is doing at least.

iopq ,

China is not successful at it, since you can still use a VPN in China. You can even self-host.

anyhow2503 ,

Somehow I knew when I wrote that comment that someone would interpret "it's possible to discourage VPN usage and make it harder for the layman" as "it's possible to prevent VPN usage completely and China is 100% successful at doing that". China hasn't gone all in on blocking VPN traffic either way, since corporations can still use them and tourists don't like having their internet connection dropped without warning (which they actually did at one point), but someday they might and it will probably be enough to prevent the majority from using VPNs to circumvent government censorship.

iopq ,

It's technically impossible without blocking cloudflare and breaking most of the internet. You can route your VPN through cloudflare, so there's that

tsonfeir ,
@tsonfeir@lemm.ee avatar

They just make it a law and use it against people at will. They make it the ISPs' responsibility to block and track access. They ban VPN software, and all the corporate OS makers obey.

You’re right in a way: how do they really get a 100% block? It’s not possible. But they can scare 90% of users away.

iopq ,

Again, how do you track a connection to a data center? VPN software is freely available on GitHub. Are you blocking GitHub too?

Even China can't do it fully

tsonfeir ,
@tsonfeir@lemm.ee avatar

Again, if vpns are banned, and vpn software is banned, all US companies will have to abide or die. GitHub is American. Apple, Microsoft, Google, etc. Windows, macOS, Android, could all be forced to report. The year of the Linux desktop!!

Your ISP knows where you go. They just pull the plug on IPs they know are VPNs.

iopq ,

No, nobody cares about Canadian law

How do you know an IP is a VPN? I can change the address of my VPS at any time

spencer ,

Pornhub might care about Canadian law, seeing as they’re a Canadian company

iopq ,

They will comply, but the users won't. Pornhub can't tell you're using a VPN

tsonfeir ,
@tsonfeir@lemm.ee avatar

Jfc. If you’re running your own vpn then you’re fine. Unless your host is in the States, and then installing it would be a violation. Detecting the presence of vpn software on a vps would be cake.

Would they do it? Probably not.

It’s less about would it and more about could it. I think it would be hilarious to watch any government spend time on this instead of… actual shit.

iopq ,

It's a Canadian law, wouldn't affect America. America banning VPNs would be much more dire

Also having a VPN in the country where pornhub is banned would be pointless for the purpose of going to pornhub

tsonfeir ,
@tsonfeir@lemm.ee avatar

Right, Canada. I mean Canada.

frezik ,

That's not feasible. A lot of companies have VPNs to protect their own networks. This increased with work from home during the pandemic. There are too many domino effects.

With SSH and an AWS instance, I can create my own VPN. It's not that hard with a bit of Linux experience. Canada would be about as successful at this as the US was at keeping PGP away from foreign exports.

tsonfeir ,
@tsonfeir@lemm.ee avatar

Government can do whatever it wants.

frezik ,

Not really. The definition of a failed state is when people openly ignore the government.

tsonfeir ,
@tsonfeir@lemm.ee avatar

What is it called when the government openly ignores itself?

nihilvain ,

With deep packet inspection they can detect a VPN protocol connection attempt and drop it.
There are already countries utilizing this method.

iopq ,

Only works if you use a protocol that can be detected.

https://github.com/XTLS/Xray-core/ has been defeating Chinese censors

nihilvain ,

Yes, my point is that banning protocols will kill all the commercial VPN offerings, cutting off a big share of the population. Obscure protocols like Xray can work, but not everyone can set them up.

And I think you can also raise some suspicion if you use too much bandwidth on that connection.
GBs of data consumption from MyTotallyLegitWebsite.me can raise eyebrows.
And that would be the only thing needed for a court notice or a visit by the police, depending upon the country.
And in anti-democratic countries you're guilty until proven innocent anyway.

iopq ,

There are commercial Xray servers that get shared with people for a subscription price and automatically update with the freshest info.

GBs of data from a connection is not that uncommon. There's a thing called cloudflare and you might already be hitting those IPs for gigabytes per month. You can route the VPN through cloudflare so it just looks like you're visiting a lot of websites hosted by cloudflare

Luftruessel , to Technology in No more Pornhub? That will depend on what happens with a Senate bill

As some wise man once said:

I'm fairly sure if they took porn off the internet, there'd only be one website left, and it'd be called "Bring back the porn!"

dariusj18 , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

Did they try Stack overflow?

RobotZap10000 ,

Why would you ever want to do that?! Marked as duplicate. Shove a cactus up your ass.

bus_factor ,

Why not skip the middle man and ask ChatGPT directly?

billiam0202 ,

*GrokAI

You know, Xitter's shittier AI.

bus_factor ,

Fair, I was thinking in the context of Stack Overflow.

TheDudeV2 OP , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

I’m not an Information Theory guy, but I am aware that, regardless of how clever one might hope to be, there is a theoretical limit on how compressed any given set of information could possibly be; and this is particularly true for the lossless compression demanded by this challenge.

Quote from the article:

The skepticism is well-founded, said Karl Martin, chief technology officer of data science company Integrate.ai. Martin's PhD thesis at the University of Toronto focused on data compression and security.

Neuralink's brainwave signals are compressible at ratios of around 2 to 1 and up to 7 to 1, he said in an email. But 200 to 1 "is far beyond what we expect to be the fundamental limit of possibility."

orclev ,

The implication of a 200 to 1 algorithm would be that the data they're collecting is almost entirely noise. Specifically that 99.5% of all the data is noise. In theory if they had sufficient processing in the implant they could filter the data down before transmission thus reducing the bandwidth usage by 99.5%. It seems like it would be fairly trivial to prove that any such 200 to 1 compression algorithm would be indistinguishable in function from a noise filter on the raw data.

It's not quite the same situation, but this should show some of the issues with this: https://matt.might.net/articles/why-infinite-or-guaranteed-file-compression-is-impossible/

Stardust ,

There is a way they could make the majority of it noise - if they reduced their expectations to only picking up a single type of signal, like thinking of pressing a red button, and tossing anything that doesn't roughly match that signal. But then they wouldn't have their super fancy futuristic human-robot mind meld dream, or dream of introducing a dystopian nightmare where the government can read your thoughts...

Paragone ,

The problem isn't "making the majority of it noise",

the problem is tossing-out the actual-noise, & compressing only the signal.

Without knowing what the actual-signal is, & just trying to send all-the-noise-and-signal, they're creating their problem, requiring 200x compression, through wrongly-framing the question.

What they need to actually do, is to get a chip in before transmitting, which does the simplification/filtering.

That is the right problem.

That requires some immense understanding of the signal+noise that they're trying to work on, though, and it may require much more processing-power than they're committed to permitting on that side of the link.

shrug

Universe can't care about one's feelings: making-believing that reality is other than it actually-is may, with political-stampeding, dent reality some, temporarily, but correction is implacable.

In this case, there's nothing they can do to escape the facts.

EITHER they eradicate enough of the noise before transmission,

XOR they transmit the noise, & hit an impossible compression problem.

Tough cookies.

_ /\ _

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

NAND - one of the 2 you listed, or they give up.

Death_Equity ,

Absolutely, they need a better filter and on-board processing. It is like they are just gathering and transmitting for external processing instead of cherry picking the data matching an action that is previously trained and sending it as an output.

I'm guessing they kept the processing power low because of heat or power availability, they wanted to have that quiet "sleek" puck instead of a brick with a fanned heatsink. Maybe they should consider a jaunty hat to hide the hardware.

Gathering all the data available has future utility, but their data transmission bottleneck makes that capability to gather data worthless. They are trying to leap way too far ahead with too high of a vanity prioritization and getting bit for it, about par for the course with an Elon project.

Miaou ,

Ugh? That's not what it means at all. Compression saves on redundant data, but it doesn't mean that data is noise. Or are you using some definition of noise I'm not aware of?

TheDudeV2 OP ,

I can try to explain, but there are people who know much more about this stuff than I do, so hopefully someone more knowledgeable steps in to check my work.

What does ‘random’ or ‘noise’ mean? In this context, random means that any given bit of information is equally as likely to be a 1 or a 0. Noise means a collection of information that is either random or unimportant/non-useful.

So, you say “Compression saves on redundant data”. Well, if we think that through, and consider the definitions I’ve given above, we will reason that ‘random noise’ either doesn’t have redundant information (due to the randomness), or that much of the information is not useful (due to its characteristic as noise).

I think that’s what the person is describing. Does that help?
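As a concrete illustration of why randomness resists compression, here's a quick sketch using Python's built-in zlib (the sizes here are arbitrary, just for demonstration):

```python
import os
import zlib

# Highly redundant data: one byte repeated 100,000 times.
redundant = b"a" * 100_000
# Effectively random data: 100,000 bytes from the OS entropy source.
noise = os.urandom(100_000)

packed_redundant = zlib.compress(redundant, 9)
packed_noise = zlib.compress(noise, 9)

# The redundant input collapses to a tiny fraction of its size;
# the random input barely shrinks at all (it can even grow slightly).
print(len(packed_redundant))  # on the order of a hundred bytes
print(len(packed_noise))      # still roughly 100,000 bytes
```

Run that and the redundant block compresses by orders of magnitude, while the random block stays essentially the same size.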

Miaou ,

I agree with your point, but you're arguing that noise can be redundant data. I am arguing that redundant data is not necessarily noise.

In other words, a signal can never be filtered losslessly. You can slap a low pass filter in front of the signal and call it a day, but there's loss, and if lossless is a hard requirement then there's absolutely nothing you can do but work on compressing redundant data through e.g. patterns, interpolation, what have you (I don't know much about compression algos).

A perfectly noise free signal is arguably easier to compress actually as the signal is more predictable.

Cocodapuf ,

I'm not sure that's accurate.

Take video for example. Using different algorithms you can get a video down half the file size of the original. But with another algorithm you can get it down to 1/4 another can get it down to 1/10. If appropriate quality settings are used, the highly compressed video can look just as good as the original. The algorithm isn't getting rid of noise, it's finding better ways to express the data. Generally the fancier the algorithm, the more tricks it's using, the smaller you can get the data, but it's also usually harder to unpack.

orclev ,

It's important to distinguish between lossy and lossless algorithms. What was specifically requested in this case is a lossless algorithm which means that you must be able to perfectly reassemble the original input given only the compressed output. It must be an exact match, not a close match, but absolutely identical.

Lossless algorithms rely generally on two tricks. The first is removing common data. If for instance some format always includes some set of bytes in the same location you can remove them from the compressed data and rely on the decompression algorithm to know it needs to reinsert them. From a signal theory perspective those bytes represent noise as they don't convey meaningful data (they're not signal in other words).

The second trick is substituting shorter sequences for common longer ones. For instance if you can identify many long sequences of data that occur in multiple places you can create a lookup index and replace each of those long sequences with the shorter index key. The catch is that you obviously can't do this with every possible sequence of bytes unless the data is highly regular and you can use a standardized index that doesn't need to be included in the compressed data. Depending on how poorly you do in selecting the sequences to add to your index, or how unpredictable the data to be compressed is you can even end up taking up more space than the original once you account for the extra storage of the index.
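A toy version of that second trick, with a made-up two-entry table (a real coder builds the index from the data and has to escape input bytes that collide with the keys):

```python
# Shared substitution table: compressor and decompressor must both have it.
TABLE = {
    b"the quick brown fox": b"\x01",
    b"jumped over the lazy dog": b"\x02",
}

def compress(data: bytes) -> bytes:
    # Replace each known long sequence with its one-byte index key.
    for phrase, key in TABLE.items():
        data = data.replace(phrase, key)
    return data

def decompress(data: bytes) -> bytes:
    # Reinsert the original sequences: the round trip is exact (lossless).
    for phrase, key in TABLE.items():
        data = data.replace(key, phrase)
    return data

msg = b"the quick brown fox jumped over the lazy dog, said the quick brown fox"
packed = compress(msg)
assert decompress(packed) == msg
assert len(packed) < len(msg)  # 11 bytes vs. 70: the input matched the table well
```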

From a theory perspective everything is classified as either signal or noise. Signal has meaning and is highly resistant to compression. Noise does not convey meaning and is typically easy to compress (because you can often just throw it away, either because you can recreate it from nothing as in the case of boilerplate byte sequences, or because it's redundant data that can be reconstructed from compressed signal).

Take for instance a worst case scenario for compression, a long sequence of random uniformly distributed bytes (perhaps as a one time pad). There's no boilerplate to remove, and no redundant data to remove, there is in effect no noise in the data only signal. Your only options for compression would be to construct a lookup index, but if the data is highly uniform it's likely there are no long sequences of repeated bytes. It's highly likely that you can create no index that would save any significant amount of space. This is in effect nearly impossible to compress.

Modern compression relies on the fact that most data formats are in fact highly predictable with lots of trimmable noise by way of redundant boilerplate, and common often repeated sequences, or in the case of lossy encodings even signal that can be discarded in favor of approximations that are largely indistinguishable from the original.

Waldowal ,
@Waldowal@lemmy.world avatar

I'm no expert in this subject either, but a theoretical limit could be beyond 200x - depending on the data.

For example, a basic compression approach is to use a lookup table that lets you map large values to smaller lookup IDs. Say the possible data only contains two values: one consisting of 10,000 letter 'a's, the other of 10,000 letter 'b's. We can map the first to the number 1 and the second to the number 2. With this lookup in place, a compressed value of "12211" would uncompress to 50,000 characters, a 10,000x compression ratio. Extrapolate that example out and there is no theoretical maximum to the compression ratio.

But that's when the data set is known and small. As the complexity grows, it does seem logical that a maximum limit would be introduced.

So, it might be possible to achieve 200x compression, but only if the complexity of the data set is below some threshold I'm not smart enough to calculate.
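That lookup-table example, written out (the table and IDs are exactly the contrived ones above):

```python
# Contrived data set with only two possible values, mapped to short ids.
BLOCKS = {"1": "a" * 10_000, "2": "b" * 10_000}

def decompress(ids: str) -> str:
    # Each id expands to its 10,000-character block.
    return "".join(BLOCKS[i] for i in ids)

expanded = decompress("12211")
assert len(expanded) == 50_000        # 5 ids -> 50,000 characters
print(len(expanded) // len("12211"))  # 10,000x compression ratio
```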

QuadratureSurfer , (edited )
@QuadratureSurfer@lemmy.world avatar

You also have to keep in mind that, the more you compress something, the more processing power you're going to need.

Whatever compression algorithm that is proposed will also need to be able to handle the data in real-time and at low-power.

But you are correct that compression beyond 200x is absolutely achievable.

A more visual example of compression could be something like one of the Stable Diffusion AI/ML models. The model may only be a few Gigabytes, but you could generate an insane amount of images that go well beyond that initial model size. And as long as someone else is using the same model/input/seed they can also generate the exact same image as someone else.
So instead of having to transmit the entire 4k image itself, you just have to tell them the prompt, along with a few variables (the seed, the CFG Scale, the # of steps, etc) and they can generate the entire 4k image on their own machine that looks exactly the same as the one you generated on your machine.

So basically, for only a little over a kilobyte, you can get 20+MB worth of data transmitted this way. The drawback is that you need a powerful computer and a lot of energy to regenerate those images, which brings us back to the problem of making this data conveyed in real-time while using low-power.

Edit:

Tap for some quick napkin math

For transmitting the information to generate that image, you would need about 1KB to allow for 1k characters in the prompt (if you really even need that),
then about 2 bytes for the height,
2 for the width,
8 bytes for the seed,
less than a byte for the CFG and the Steps (but we'll just round up to 2 bytes).
Then, you would want something better than just a parity bit for ensuring the message is transmitted correctly, so let's throw on a 32 or 64 byte hash at the end...
That still only puts us a little over 1KB (1,078 bytes)...
So for generating a 4k image (.PNG file) we get ~24MB worth of lossless decompression.
That's 24,000,000 Bytes which gives us roughly a compression of about 20,000x
But of course, that's still going to take time to decompress as well as a decent spike in power consumption for about 30-60+ seconds (depending on hardware) which is far from anything "real-time".
Of course you could also be generating 8k images instead of 4k images... I'm not really stressing this idea to its full potential by any means.

So in the end you get compression at a factor of more than 20,000x for using a method like this, but it won't be for low power or anywhere near "real-time".
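The napkin math above, tallied up in Python (field sizes as assumed in the comment):

```python
# Byte budget for sending generation parameters instead of the image itself.
prompt_bytes = 1000   # up to ~1k characters of prompt text
width_bytes = 2
height_bytes = 2
seed_bytes = 8
cfg_and_steps = 2     # CFG scale + step count, rounded up
hash_bytes = 64       # integrity check on the message

payload = (prompt_bytes + width_bytes + height_bytes
           + seed_bytes + cfg_and_steps + hash_bytes)
image_bytes = 24_000_000  # ~24 MB for the 4k PNG, per the estimate above

print(payload)                 # 1078 -> a little over 1 KB
print(image_bytes // payload)  # 22263 -> "roughly 20,000x"
```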

Cosmicomical ,

just have to tell them the prompt, along with a few variables

Before you can do that, you have to spend hours of computation to figure out a prompt and a set of variables that perfectly match the picture you want to transmit.

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

Sure, but this is just a more visual example of how compression using an ML model can work.

The time you spend reworking the prompt, or tweaking the steps/cfg/etc. is outside of the scope of this example.

And if we're really talking about creating a good pic, it helps to use tools like ControlNet/inpainting/etc., which could still be communicated to the receiving machine, but then you start to lose out on some of the compression, by about 1KB for every additional time you need to run the model to get the correct picture.

Cosmicomical ,

You are removing the most computationally intensive part of the process in your example, that's making it sound easy, while adding it back shows that your process is not practical.

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

The first thing I said was, "the more you compress something, the more processing power you're going to need [to decompress it]"

I'm not removing the most computationally expensive part by any means and you are misunderstanding the process if you think that.

That's why I specified:

The drawback is that you need a powerful computer and a lot of energy to regenerate those images, which brings us back to the problem of making this data conveyed in real-time while using low-power.

And again

But of course, that's still going to take time to decompress as well as a decent spike in power consumption for about 30-60+ seconds (depending on hardware)

Those 30-60+ second estimates are based on someone using an RTX 4090, the top end Consumer grade GPU of today. They could speed up the process by having multiple GPUs or even enterprise grade equipment, but that's why I mentioned that this depends on hardware.

So, yes, this very specific example is not practical for Neuralink (I even said as much in my original example), but this example still works very well for explaining a method that can allow you a compression rate of over 20,000x.

Yes you need power, energy, and time to generate the original image, and yes you need power, energy, and time to regenerate it on a different computer. But to transmit the information needed to regenerate that image you only need to convey a tiny message.

Cocodapuf ,

Neurons work with analogue data; I'm not sure lossless algorithms are necessary.

SharkAttak , to Technology in Neuralink looks to the public to solve a seemingly impossible problem
@SharkAttak@kbin.social avatar

Why should we? What's in it for us?

AngryCommieKender ,

You can have a free "flamethrower" cigarette lighter. The company is bankrupt, and Musk has a warehouse of the things he didn't sell.

Gsus4 ,
@Gsus4@mander.xyz avatar

Nothing, but then you could patent it and license it to anyone but elon :) are you motivated yet?

QuadratureSurfer ,
@QuadratureSurfer@lemmy.world avatar

A job interview! (I wish I was joking).

The reward for developing this miraculous leap forward in technology? A job interview, according to Neuralink employee Bliss Chapman. There is no mention of monetary compensation on the web page.

BobGnarley ,

I mean damn bro helping humans potentially walk again is a pretty big "for us" thing if you think about it in terms of humankind and not just yourself. Like imagine if someone were trying to cure cancer with the help of the public and you're all like "well what the fuck is in it for ME though?"

cestvrai ,

Imagine we all pooled our resources to fund medical research through taxes only for private companies to exploit the technology and jack up the prices…

A brain implant for rich people isn’t necessarily “for us”.

SharkAttak ,
@SharkAttak@kbin.social avatar

Oh, but I'm not saying this out of selfishness. The problem for me is not the cancer cure in itself, but who is doing the research:

  • the experiments on monkeys were questionable in method and nature, and led to death and madness;
  • the other chip installed in a human has already lost the majority of its connection wires;

And not to forget, it's not been specified how the public giving the ideas would benefit from it. Musk is not exactly known as the philanthropic kind.
drdiddlybadger , to Technology in Neuralink looks to the public to solve a seemingly impossible problem
@drdiddlybadger@pawb.social avatar

That isn't their problem at all. Their problem is scar tissue buildup, which they haven't even bothered addressing. Wtf are they doing talking about data compression when they can't even maintain a connection?

Modern_medicine_isnt ,

Cause there are always more patients... but more data will let them get more press when it enables more interesting demos.

Cocodapuf ,

You really think they only have one problem to solve? If that were the case this would be relatively easy.

BarbecueCowboy ,

There were rumors of that and a lot of other complications in the animal trials. I don't think we ever got proof, but a lot of irregularities that were explained away. Could be a lot more problems coming.

Evotech , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

Did they try middle out compression?

AA5B , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

Already solved by evolution. This is the same problem as all of us have with visual data. We’ve evolved to need much less data transfer by doing some image processing first. Same deal. Stick some processors in there so you only need to transfer processed results, not raw data

RainfallSonata , to Technology in SUV stolen from Toronto driveway shows up 50 days later — AirTags tracked vehicle from Canada to Middle East, offering glimpse into shipping routes used by car thieves

Who does he expect is going to pay to ship it back to him?

variants ,

I guess the insurance company?

RainfallSonata ,

Would they, though? Wouldn't they just write it off? I mean, I have no idea what it costs to ship a vehicle from the middle east, but if it involves cargo ships and freight trains, would they bother? Has he got to arrange and pay for it himself and get reimbursed? Just buy a new car. Trying to get it back at this point is over the top. Maybe don't buy a douchemobile next time.

barsoap ,

Brand-new cars get shipped all the time over oceans before showing up in showrooms, it's not that expensive.

OTOH it might be cheaper to sell the car in the middle east and buy a used one on the continent it's produced on. On yet another hand the insurance might just say "we don't want to deal with this shit" and pay out: Even figuring out the legalities, paying agents in multiple countries etc. might be more expensive.

Quill0 ,
@Quill0@lemmy.digitalfall.net avatar

That is done in bulk, however. Shipping one car is expensive.

CADmonkey ,

Something tells me stolen cars are shipped in bulk.

ikidd ,
@ikidd@lemmy.world avatar

They'd absolutely write it off, the shipping would pale in comparison to the effort to clear all the foreign paperwork and then get it back into Canada. And as the owner, you'd want it replaced because you have no idea what's been done to it in the several months it would take to get it back.

Pyr_Pressure ,

Apparently the shipping wasn't too much for the thieves. Wonder why they bothered stealing a vehicle from Canada and not somewhere closer to where it was going to end up.

dubyakay ,

The Canadian government subsidizes their shipping at the expense of tax payers.

ikidd ,
@ikidd@lemmy.world avatar

Maybe because the truck is free for the thieves and costs full price for the insurance company? And if you want to pick peaches, you go where the peaches are, then drive all the way back. There are very few Yukon or Yukon size vehicles outside of NA.

Nomecks , to Technology in Neuralink looks to the public to solve a seemingly impossible problem

Listen Elon, I have three words that will blow your mind: Middle out compression!

AbidanYre ,

That's a lot more civil than the three words I have for him.

Rentlar , to Technology in Canadian Supreme Court Rules Police Now Need a Warrant to Get a Person's IP

The ruling said the privacy interests cannot be limited to what the IP address can reveal on its own "without consideration of what it can reveal in combination with other available information, particularly from third-party websites."

It went on to say that because an IP address unlocks a user's identity, it comes with a reasonable expectation of privacy and is therefore protected by the Charter.

Personally I agree with the majority opinion here. "For the safety of children and crime victims" is too often used as an excuse to unleash wide-reaching attacks on privacy.

Police will still be able to obtain the information they need when the cases involving children and victims of crime happen, they just need to get permission from the courts. This ruling seems to prevent law enforcement from doing an internet analogue of "carding", requesting and obtaining random Canadian IPs in search of something to prosecute.

boatsnhos931 , to Technology in No more Pornhub? That will depend on what happens with a Senate bill

Let's start blocking websites in our country because it might be inappropriate for the children. Yes I understand that it is very easy to get around but ask yourself how many pornographic websites are out there...Who is going to decide which sites or content is ok? AI? LOL
https://lemmy.world/pictrs/image/87c900db-d72d-41bb-8c89-5856cd074a1c.jpeg

n3m ,
@n3m@lemmy.electrospek.com avatar

This is what parental controls are for. Why is the government trying to be every kid's parent? This is the definition of Big Brother. Say it's for the kids to get people on board, then use that info for other means.

I don't care about porn sites personally but to what end will this take us?

boatsnhos931 ,

I will snap in this motherfucker if I can't look at femboys with big tiddies
