I cannot handle the fucking irony of that article being on Nature, one of the organizations most responsible for fucking it up in the first place. Nature is a peer-reviewed journal that charges people thousands upon thousands of dollars to publish (that's right, charges, not pays), asks peer reviewers to volunteer their time, and then charges the very institutions that produced the knowledge rent to access it. It's all upside. Because they're the most prestigious journal (or maybe one of two or three), they can charge rent on that prestige, then leverage it to buy and start other subsidiary journals. Now they have this beast of an academic publishing empire that is a complete fucking mess.
Wow, I never knew about that and it's not just a small fee either. This 2020 article has it at 9,500 Euro/10,300 USD. "Some observers worry Nature's €9500 publishing fee is so high that it threatens to divide authors into two tiers—those at wealthy institutions or with access to funds to pay, and everyone else."
It's already hard enough getting funding in some fields of science without that kind of added expense to put your data out there. Definitely sounds like you're right to call them out.
Yeah, it's grotesque. Doubly so when you consider that it's often public money that funds the research that they get to paywall. I've been really ragging on them lately for their role in the AI hype, too, which you can read about here and here if that sort of thing interests you.
I'm suspicious of this concept of editorial independence. I think it's a smoke screen that lets companies have their cake and eat it too. As far as I'm concerned, whoever cashes the checks also gets the blame, because either ownership means something, in which case the concept exists to obfuscate that, or it doesn't, in which case why is Nature buying up other journals?
I daresay if young people could afford a home, a car, a family, and had some disposable income, free time, and any fucking prospect of a satisfactory life then they'd be a lot less depressed.
I don't think social media is particularly good but it's far from the worst problem facing young people today. The "phone bad" crap is just a lazy cop out.
I don't think it's a lazy cop out at all; it's recognizing a complex issue that interweaves with the new realities of life for young adults.
What you stated is the lazy cop out, you're dismissing an entire problem space at the wave of a hand without critically thinking about it.
Everything is connected. An example: heavy social media use is correlated with lower critical thinking capability, shorter attention spans, and more extreme political and emotional swings, leading to a population that is more manipulable and less cohesive.
That causes them to vote and act against their own interests at the behest of whoever has enough money to influence them through channels they "trust", which in turn feeds a degrading social and financial situation.
It seems you are criticizing the book the author quotes, not the article itself.
"
Two things need to be said after reading The Anxious Generation. First, this book is going to sell a lot of copies, because Jonathan Haidt is telling a scary story about children’s development that many parents are primed to believe. Second, the book’s repeated suggestion that digital technologies are rewiring our children’s brains and causing an epidemic of mental illness is not supported by science. Worse, the bold proposal that social media is to blame might distract us from effectively responding to the real causes of the current mental-health crisis in young people
While you have a point you might consider what little free time young people have is largely spent on social media full of dark patterns and negative feedback loops and/or gaming stuffed with gambling. One does not detract from the other problems you outline.
"Phone bad" holds true as long as these big corporations insist on regulating themselves when all they do is feed people propaganda to keep anything from changing.
This is exactly what I'm talking about when I argue with people who insist that an LLM is super complex and totally is a thinking machine just like us.
It's nowhere near the complexity of the human brain. We are several orders of magnitude more complex than the largest LLMs, and our complexity changes with each pulse of thought.
Back in the early 2000s CERN was able to simulate the brain of a flat worm. Actually simulate the individual neurons firing. A 100% digital representation of a flatworm brain. And it took up an immense amount of processing capacity for a form of life that basic, far more processor intensive than the most advanced AIs we currently have.
Modern AIs don't bother to simulate brains, they do something completely different. So you really can't compare them to anything organic.
"far more processor intensive than the most advanced AIs we currently have"
This is the second comment I've seen from you where you confidently say something incorrect. Maybe stop trying to be orator of the objective and learn a little more first.
2014, not early 2000s (unless you were talking about the century or something).
OpenWorm project, not CERN.
And it was run on Lego Mindstorms. I am no AI expert, but I am fairly certain that it is not "far more processor intensive than the most advanced AIs we currently have".
Citation needed on that comment of yours. Because I know for a fact that what I said is true. Go look it up.
Maybe you should be a little less sure of your "facts", and listen to what the world has to teach you. It can be marvelous.
I agree, but it isn't so clear cut. Where is the cutoff on complexity required? As it stands, both our brains and the most complex AIs are pretty much black boxes. It's impossible to say this system we know vanishingly little about is/isn't fundamentally the same as this system we know vanishingly little about, just on a different scale. The first AGI will likely still have most people saying the same things about it, "it isn't complex enough to approach a human brain." But it doesn't need to equal a brain to still be intelligent.
It's demonstrably several orders of magnitude less complex. That's mathematically clear cut.
"Where is the cutoff on complexity required?"
Philosophical question without an answer - We do know that it's nowhere near the complexity of the brain.
"both our brains and most complex AI are pretty much black boxes."
There are many things we cannot directly interrogate which we can still describe.
"It's impossible to say this system we know vanishingly little about is/isn't fundamentally the same as this system we know vanishingly little about, just on a different scale."
It's entirely possible to say that, because we know the fundamental structures of each, even if we haven't mapped the entirety of either's complexity. We know they're fundamentally different - their basic behaviors are fundamentally different. That's what fundamentals are.
"The first AGI will likely still have most people saying the same things about it, 'it isn't complex enough to approach a human brain.'"
Speculation but entirely possible. We're nowhere near that though. There's nothing even approaching intelligence in LLMs. We've never seen emergent behavior or evidence of an id or ego. There's no ongoing thought processes, no rationality - because that's not what an LLM is. An LLM is a static model of raw text inputs and the statistical association thereof. Any "knowledge" encoded in an LLM exists entirely in the encoding - It cannot and will not ever generate anything that wasn't programmed into it.
It's possible that an LLM might represent a single, tiny, module of AGI in the future. But that module will be no more the AGI itself than you are your cerebellum.
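To make "statistical association of raw text" concrete, here's a toy sketch: a bigram counter in Python. This is vastly simpler than a real LLM (which learns continuous weights over tokens, not raw word counts), but it illustrates the same point about a static model frozen at training time.

```python
import random
from collections import defaultdict

# Toy illustration only: a bigram model "knows" nothing but the
# word-to-word statistics captured at training time. It can never
# emit a word pair that wasn't in its training data.
corpus = "the cat sat on the mat the cat ate the fish".split()

counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1  # count every observed word pair

def next_word(prev):
    # Sample the next word in proportion to how often it followed
    # `prev` in the corpus - pure frozen statistics, no reasoning.
    options = counts[prev]
    words = list(options)
    weights = [options[w] for w in words]
    return random.choices(words, weights=weights)[0]

print(next_word("the"))  # always one of: cat, mat, fish
```

However often you call it, the model's "knowledge" never changes, which is the static-encoding point above.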
"But it doesn't need to equal a brain to still be intelligent."
LLMs don't work like the human brain; you are comparing apples to suspension bridges.
The human brain works via a series of interconnected nodes and complex chemical interactions; LLMs work on multi-dimensional search spaces, their "brains" existing in 15 billion spatial dimensions. Yours doesn't, so you can't compare the two and come up with any kind of meaningful comparison. All you can do is challenge it against human-level tasks and see how it stacks up. You can't estimate it from complexity.
You're missing half of it. The data cube is just for storing and finding weights. Those weights are then loaded into the nodes of a neural network to do the actual work. The neural network was inspired by actual brains.
I mean you can model a neuronal activation numerically, and in that sense human brains are remarkably similar to hyper dimensional spatial computing devices. They’re arguably higher dimensional since they don’t just integrate over strength of input but physical space and time as well.
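A rough sketch of the two abstractions being compared, in Python. Both are my simplifications (parameter values are arbitrary): the artificial neuron is the weighted-sum-plus-nonlinearity used in neural networks, while the leaky integrate-and-fire model is a standard minimal stand-in for a biological neuron, and it shows the extra dimension mentioned above: it integrates input over time, not just over input strength.

```python
import math

# 1) Artificial "neuron" as used in neural nets: a weighted sum
#    squashed by a nonlinearity. Stateless: same input, same output.
def artificial_neuron(inputs, weights, bias):
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

# 2) Leaky integrate-and-fire neuron (very simplified biology):
#    membrane potential accumulates input over time, leaks away,
#    and emits a spike when it crosses a threshold.
def leaky_integrate_and_fire(input_current, steps=100, dt=1.0,
                             tau=10.0, threshold=1.0):
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += dt * (-v / tau + input_current)  # leak plus drive
        if v >= threshold:
            spikes += 1
            v = 0.0  # reset after firing
    return spikes
```

The first function is a pure mapping; the second has state that evolves across time steps, which is one way the "integrates over physical space and time" point cashes out numerically.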
except Sci-hub hasn't been adding new papers since 2020. Anna's Archive is a better bet, because they aggregate both sci-hub and libgen, among others. They also make torrents available for data hoarders.
You're right about Sci-Hub because of their Indian lawsuit which is very important to them, but I didn't know that Anna's Archive was a repository of scientific journals. Is it? I know Library Genesis (or LibGen) has a lot of scientific textbooks, but I didn't know it had papers. Does it?
Anyhow, Anna's Archive and LibGen are super awesome too!
Several meta-analyses and systematic reviews converge on the same message. An analysis done in 72 countries shows no consistent or measurable associations between well-being and the roll-out of social media globally. Moreover, findings from the Adolescent Brain Cognitive Development study, the largest long-term study of adolescent brain development in the United States, has found no evidence of drastic changes associated with digital-technology use. Haidt, a social psychologist at New York University, is a gifted storyteller, but his tale is currently one searching for evidence.
Even though everybody seems convinced our attention spans have decreased, there is no conclusive evidence of it and scientists don’t even really think it is useful to talk about attention outside the context of motivation anyways.
Your attention span is fine, you are just too burned out from modern life to invest energy into things that take a lot of sustained focus that aren’t essential to survival.
You also have to be way more picky with what content you choose to engage with, because there is sooooooo much more content now, and that may look like a "short attention span" when your brain is optimizing for tossing out the 95% of fluff to get right to the thing you actually wanted.
Our attention spans are fine, this has been the most boring moral panic ever but that is really all it is.
I can't make sense of bringing this in for this piece.
The headline of this piece is not really a question. Sure, there is a question in it. But it answers the question in the headline... and that answer isn't "no." It's "it's not clear what the cause is."
Blaming teenage mental illness on social media feels to me like the boomers are trying to find a different scapegoat than all the factors caused by their own stupidity, greed and destruction of human habitat.
Odd when we are also reading how studies are showing increased levels of depression and suicide. Which lie do we believe? I'll just go with what I see happening with my own eyes and experience then.
This piece isn't saying there is no increase in depression and suicide. In fact, the whole premise of the article is that by blaming screen time we might be missing the actual cause of the issue (increase in depression and anxiety) and thus doing our children a disservice.
I would suggest that before trying to decide who to believe, you actually listen to their argument and evidence first. Instead of just thinking that your own perception of the world is perfectly objective and not anecdotal.
except Sci-hub hasn’t been adding new papers since 2020. Anna’s Archive is a better bet, because they aggregate both sci-hub and libgen, among others. They also make torrents available for data hoarders. Their torrents total over 600 TB at this point, but include books in addition to articles.
Sci-hub and libgen already output lists of torrents. Do they also archive supplementary information? That's where most of the actually interesting data is; sometimes it's open source, sometimes it's not (at least in my field).
It pisses me off that they're calling quantum data transmission quantum entanglement, it's not the same thing and it's misleading as fuck.
Quantum entanglement is about two quantum particles sharing the same state which if implemented somehow would allow for universal communication with no time lag. Sending quantum state communication through fiber optic, while an achievement for distributed quantum computing, is not quantum entanglement!!
Quantum entanglement communications also have fundamental problems that will likely render them effectively unusable. You need a key to decrypt anything you send, and the key has to travel no faster than c. It's impossible to tell the data from the noise without the key. Attempting to read the data or to change the data being sent also collapses the effect, which can only be fixed by bringing the two systems together. In short, you can only send a single packet of data and you can't use it without a key transmitted using traditional methods.
The limit is c because you have to use cables, radio, or other traditional methods to send the key. The data in the entangled pair would also have to be set at the time the two devices are constructed, so that's not super useful. It might be useful for single use authentication, but that's about it.
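The "can't tell the data from the noise without the key" property is the same one a classical one-time pad has, so here's a quick XOR sketch as an analogy. To be clear, this is my classical illustration of that single property, not a model of any quantum protocol.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with the matching key byte; XORing again
    # with the same key undoes it.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"launch code"
key = secrets.token_bytes(len(message))  # must be shared out-of-band,
                                         # no faster than c

ciphertext = xor_bytes(message, key)          # looks like uniform noise
recovered = xor_bytes(ciphertext, key)        # key recovers the data
assert recovered == message
```

Without `key`, `ciphertext` is statistically indistinguishable from random bytes, which mirrors the point that the entangled channel alone carries nothing usable until the classically transmitted key arrives.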
Don't think of entanglement as being like one object in two spots. Think of it like identical twins. One twin getting a hair cut does nothing to the other twin's hair. Similarly, altering a property of one entangled particle does nothing to the other and actually means they are no longer entangled or identical.
No problem. I was pretty disappointed when I learned all the sci-fi writers were getting it wrong. Though, to be fair, it really should be called something else.
It's interesting reading quotes from that article like: "If you can’t verify what someone else has said at some other point, you’re just trusting to blind faith for artefacts that you can no longer read yourself." and "After you’ve been dead for 100 years, are people going to be able to get access to the things you’ve worked on?"
It reminds me of problems the US military is having with refitting/upgrading old ICBMs. From the 2021 article, "Minuteman III Missiles Are Too Old to Upgrade Anymore, STRATCOM Chief Says": "Where the drawings do exist, "they're like six generations behind the industry standard," he said, adding that there are also no technicians who fully understand them. "They're not alive anymore."
It sounds like the danger is we'll be able to access the science (or just trust it's true) but in some cases we'll be unable to retrace our steps.
Good Lord, if the US nuclear arsenal is that antiquated, I shudder to think of where the Russians are at. Please don't short-circuit and accidentally launch…
I wonder if the fact that none have actually exploded yet means that we should be reassured that the vast majority wouldn't actually work.
Or, possibly, just have had their components and fuel stripped decades ago and they're just being "maintained" to keep up appearances for higher-ups. That one is definitely true in at least some cases.
Not going out and interacting as freely with people paying direct attention to one another leads to heightened mental issues? Shocking.
I grew up in the 80's and we were super fucking social. Anyone that didn't live it cannot grasp how far we have fallen from what we once had, and we had no idea how good we had it.
Not to mention everything is being recorded to haunt every kid there is.
I feel real bad for modern day kids, my daughter included. An important aspect of humanity has been lost.
Exactly. Sure, we can say it’s not directly related to tech devices, but it’s definitely related to not wandering and having real human connection constantly.
And with the recording of everything - absolutely changes behavior.
There is a vast difference between the internet, which gives you access to information, and social media with algorithms fine-tuned to keep you there as long as possible.
Cameras everywhere are for sure a disaster for anyone's sanity and development.
100%. I read my phone a lot. Typically Lemmy and Wall St Journal. If I didn’t have this device I’d be reading paper magazines and newspapers just like I did pre-device / internet.
It’s not the device, it’s how it’s being used that’s harmful. But I think we all agree with that
Yeah, everyone in this thread saying the phone bad is a Boomer cop out is oversimplifying the issue.
Yeah, there’s probably a component of taking the blame away from decreased quality of life by blaming it on phones—but you can’t neglect the effect that lack of social interaction has. I’m from the same era, and it’s overwhelming to think how much more complex everything has gotten.
That's not a phone issue, that's a place issue. Where can your daughter go (without needing to drive) to hang out with friends? Can she conceivably walk there? Can her friends? I've been hearing my entire life that I just need to go outside and bla bla bla, but I don't have anywhere to go. The closest park is a good half hour walk, and there aren't even sidewalks! How pleasant. There's nowhere for children outside, it's nigh impossible to walk anywhere, and it's not like your parents would let you anyway since there probably aren't even sidewalks the whole way.
For perspective, I live a very reasonable 10 minute walk from the elementary school I went to. I think you'll agree that's a reasonable distance for at least the older kids to walk. However, it took them until I was a senior in high school to put in the sidewalk. You literally couldn't get to the elementary school on foot without walking on the side of the road for ~4 minutes. Even now the experience is awful and the crossings are unsafe. This is the world us phone kids grew up in. It's not that we don't want to go hang out in person, there's just nowhere to go, and by the time people can drive it's far too late.
Also the high school is about 40 minutes walk, there's even sidewalks the whole way (now (only on one side))! It's an awful experience as there's absolutely no shade and about half of it is down a stroad.