

Thrashy

@Thrashy@lemmy.world

Laboratory planner by day, toddler parent by night, enthusiastic everything-hobbyist in the thirty minutes a day I get to myself.


Thrashy ,

The average American house with a basement will have something like 40 m^3 of concrete in its foundation. If all of it could be utilized, that's still ~12 kWh of storage capacity. Nothing to sneeze at.
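As a sanity check on those figures (a minimal sketch in Python; the energy density is simply what the two numbers above imply, not a measured property of any real concrete supercapacitor):

```python
# Back-of-envelope from the figures above: 40 m^3 of foundation
# concrete storing ~12 kWh implies a volumetric energy density of
# roughly 0.3 kWh per cubic meter.
volume_m3 = 40        # typical basement foundation, per the comment
capacity_kwh = 12     # claimed storage if all of it could be utilized

density = capacity_kwh / volume_m3
print(f"Implied energy density: {density:.2f} kWh/m^3")  # -> 0.30
```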

Thrashy , (edited )

Any time you see perovskite-based cells mentioned, you can assume for the time being that it's just R&D. Perovskites are cool materials that open up a lot of neat possibilities, like cheaply inkjet-printing PV cells, but they have fundamental durability issues in the real world. When exposed to water, oxygen, and UV light, the perovskite crystals break down fairly rapidly.

That's not to say that the tech can't be made to work -- at least one lab team has developed cells with longevity similar to silicon PVs -- but somebody's going to have to come up with an approach that solves for performance, longevity, and manufacturability all at once, and that hasn't happened yet. I imagine that when they do, that will be front-and-center in the press release, rather than just an efficiency metric.

Thrashy ,

This is actually becoming somewhat commonplace. In many cutting-edge cancer therapies, for example, blood is drawn from the patient and processed in an on-site tissue-culture suite to extract the patient's immune cells and sensitize them to a marker expressed by their specific cancer cells, and the modified immune cells are then transfused back into the patient. It's not cheap, per se, but it's something most top-tier cancer centers can do. The analogous process -- extracting stem cells, inducing them to differentiate into pancreatic islet cells, and transplanting those into the patient -- isn't that big of a jump, and it would be cheaper than a lifetime of insulin in any case. It also points the way toward treating other kinds of organ failure without the risk of rejection.

The ugly truth behind ChatGPT: AI is guzzling resources at planet-eating rates (www.theguardian.com)

Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually....

Thrashy ,

Data center cooling towers can be closed- or open-loop, and can even operate in a hybrid mode depending on demand and air temperature/humidity. Problem is, the places where open-loop evaporative cooling works best are arid, low-humidity regions where water is a scarce resource to start with.
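For a sense of scale, the water cost of open-loop cooling falls straight out of the latent heat of vaporization; a minimal sketch in Python (idealized, evaporation-only estimate; the 10 MW load is a made-up example, and real towers lose additional water to drift and blowdown):

```python
# Every kilogram of water evaporated in an open-loop tower carries away
# its latent heat of vaporization, ~2.26 MJ/kg.
LATENT_HEAT_J_PER_KG = 2.26e6

def evaporative_water_use_m3_per_day(heat_rejected_mw: float) -> float:
    """Water evaporated per day to reject the given heat load."""
    kg_per_s = heat_rejected_mw * 1e6 / LATENT_HEAT_J_PER_KG
    return kg_per_s * 86_400 / 1_000  # 1 kg of water is ~1 liter

print(f"{evaporative_water_use_m3_per_day(10):.0f} m^3/day for 10 MW")
# -> ~380 m^3/day, before drift and blowdown losses
```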

On the other hand, several of the FAANGs are building datacenters right now in my area, where we're in the watershed of the largest river in the country and it's regularly humid and rainy. Any water used in a given process is either treated and released back into the river, or fairly quickly condenses back out of the atmosphere as rain a few hundred miles further east (where it eventually collects back into the same river). The only way water is "wasted" in this environment has to do with the resources used to treat and distribute it. However, because it's often hot and humid around here, open-loop cooling isn't as effective, and it's more common to see closed-loop systems.

Bottom line, though, I think the siting of water-intensive industries in water-poor parts of the country is a governmental failure, first and foremost. States like Arizona in particular have a long history of planning as though they aren't in a dry desert that has to share its only renewable water resource with two other states, and offering utility incentives to potential employers that treat that resource as if it's infinite. A government that was focused on the long-term viability of the state as a place to live rather than on short-term wins that politicians can campaign on wouldn't be making those concessions.

Thrashy ,

Problem is that if you're looking for FOSS software outside of the absolute most mainstream use cases, that type of software is the only available option. GIMP and Inkscape have been mentioned but throw FreeCAD into the ring as well. Shotcut and Kdenlive are passable, but don't quite measure up to the commercial alternatives.

My particular hobby horse is CFD code. OpenFOAM is fantastic from a technical standpoint, but until recently, to actually use it you either had to buy a commercial front-end, or literally write C++ header files to set up your cases. There's a heroic Korean developer who's put together a basic but very functional front-end GUI in the last year to change that, but it only covers relatively straightforward cases at the moment.
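For readers who haven't touched OpenFOAM: a case is a directory of C++-style dictionary files, and setting one up by hand means authoring those yourself. A minimal sketch in Python that generates one of them, system/controlDict (the myCase path is hypothetical, and a real case also needs fvSchemes, fvSolution, a mesh, and boundary conditions):

```python
from pathlib import Path

# A bare-bones controlDict for a steady-state solver run.
CONTROL_DICT = """\
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      controlDict;
}

application     simpleFoam;   // steady-state incompressible solver
startFrom       startTime;
startTime       0;
stopAt          endTime;
endTime         1000;         // iteration count for a steady run
deltaT          1;
writeControl    timeStep;
writeInterval   100;
"""

system_dir = Path("myCase/system")
system_dir.mkdir(parents=True, exist_ok=True)
(system_dir / "controlDict").write_text(CONTROL_DICT)
```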

Thrashy ,

It's not a coincidence that Texas is a hotbed of development for "microgrid" systems to cover for when ERCOT shits the bed -- and of course all those systems are made up of diesel and natural gas generator farms, because Texans don't want any of that communist solar power!

I've got family in Texas who love it there for some reason, but there's almost no amount of money you could pay me to move there. Bad enough when I have to work on projects in the state -- contrary to the popular narrative, in my personal opinion it's a worse place than California to try and build something, and that's entirely to do with the personalities that seem to gravitate to positions of power there. I'd much rather slog through the bureaucracy in Cali than tiptoe around a tinpot dictator in the planning department.

Thrashy ,

I exaggerate -- but Magic Rock is doing booming business installing strings of natural gas generators at Buc-ee's across the state, and I'm currently dealing with an institutional client who wanted to provide backup power for a satellite campus, and didn't even stop to consider battery-backed PV on the way to asking for a natural gas generator farm.

Thrashy ,

Historically, AMD has only been able to take the performance crown from Intel when Intel has made serious blunders. In the early 2000s, it was Intel committing to NetBurst in the belief that processors could scale past 5 GHz on their fab processes if pipelined deeply enough. Instead, they got caught out by unexpected quantum effects leading to excessive heat and power leakage, at the same time that AMD produced a very good follow-on to the Athlon XP line of CPUs in the form of the Athlon 64.

At the time, Intel did resort to dirty tricks to lock AMD out of the prebuilt and server space, for which they ultimately faced antitrust action. But the net effect was that AMD wasn't able to capitalize on its technological edge, and ended up having to sell off its fabs for cash, while Intel bought enough time to revise its mobile CPU design into the Core series of desktop processors and reclaim the technological advantage. Simultaneously, AMD bet the farm on Bulldozer, believing that the time had come to prioritize multithreading over single-core performance (it wasn't time yet).

This is where we enter the doldrums, with AMD repeatedly trying and failing to make the Bulldozer architecture work while Intel coasted along on marginal updates to the Core 2 architecture for almost a decade. Intel was gonna have to blunder again to change the status quo -- which they did, by betting against EUV for their 10nm fab process. Intel's process leadership stalled and performance hit a wall, while AMD finally produced a competent architecture in the form of Zen, then moved ahead of Intel on process when it started manufacturing Zen 2 at TSMC.

Right now, with Intel finally getting up to speed with EUV and working on architectural improvements to catch up with AMD (and both needing to bridge the gap to Apple Silicon now), at the same time that AMD is going from strength to strength with Zen revisions, we're in a very interesting time for CPU development. I fear a bit for AMD, as I think the fundamentals are stronger for Intel (a stronger data-center AI value proposition, a graphics group seemingly on the upswing now that they're finally taking it seriously, and still in control of its destiny in terms of fab processes and manufacturing), while AMD is struggling with GPU and AI development and dependent for process leadership on TSMC, which is perpetually under threat from mainland China. But there's a lot of strong competition in the space, which hasn't been the case since the days of the Northwood P4 and Athlon XP, and that's exciting.

Thrashy ,

The only link I'm aware of is that Intel operates an R&D center in Haifa (which, as it happens, produced the Pentium M architecture that became the Core series of CPUs and saved Intel's bacon after they bet the farm on NetBurst and lost to the Athlon 64). Linkerbaan's apparent reinvention of the Protocols of the Elders of Zion notwithstanding, that office exists to tap into the pool of talented Israeli electronics and semiconductor engineers.

Thrashy ,

On the one hand, I agree with you that the expected lifespan of current OLED tech doesn't align with my expectation of monitor life... But on the other hand, I tend to use my monitors until the backlight gives out or some layer or other in the panel stackup shits the bed, and I haven't yet had an LCD make it past the decade mark.

In my opinion OLED is just fine for phone displays and TVs, which aren't expected to be lit 24/7 and don't have lots of fixed UI elements. Between my WFH job and hobby use, though, my PC screens are on about 10 hours a day on average, with the screen displaying one of a handful of programs with fixed, high contrast user interfaces. That's gonna put an OLED panel through the wringer in quite a bit less time than I have become used to using my LCDs, and that's not acceptable to me.
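To put rough numbers on that (a minimal sketch; the panel ratings below are illustrative round figures I've assumed, not manufacturer specs, and static UI elements cause localized burn-in well before full-panel half-life):

```python
ASSUMED_OLED_HALF_LIFE_H = 30_000  # assumed hours to half brightness
ASSUMED_LCD_BACKLIGHT_H = 50_000   # assumed backlight service life

hours_per_day = 10  # the WFH usage pattern described above

for name, rated_hours in [("OLED", ASSUMED_OLED_HALF_LIFE_H),
                          ("LCD", ASSUMED_LCD_BACKLIGHT_H)]:
    years = rated_hours / (hours_per_day * 365)
    print(f"{name}: ~{years:.1f} years at {hours_per_day} h/day")
# -> OLED: ~8.2 years, LCD: ~13.7 years, under these assumptions
```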

Thrashy ,

Through the course of my career I've somehow lost office space as I've ascended the corporate food chain. I had a private office/technician room in my first job out, then had an eight-foot cubicle with high walls, then a six-foot cubicle with low dividers, and then the pandemic hit. The operations guy at the last place was making noises about a benching arrangement after RTO, like people were going to put up with being elbow to elbow with Chris The Conference Call Yeller and Brenda The Lip Smacking Snacker while Team Loudly Debates Marvel Movie Trivia is yammering away the next row over.

Hell, if it meant getting a space to myself with enough privacy to hear my own thoughts I might consider giving up my current WFH gig. But everybody's obsessed with building awful office hellscapes and I don't have the constitution to put up with that kind of environment.

Thrashy ,

Well, that'll happen if you don't take your Neuropozyne. Their test subject should have budgeted for that before getting augmented.

Thrashy ,

It's all Broadwell Xeons. Sure, there's 8,000 of 'em, but after you factor in purchase price, moving and storage costs, time spent parting out nodes, shipping costs, etc., I think you'd have a hard time breaking even -- and for an end user, current server CPUs get you something like 4x the FLOPS per socket at half the power consumption.
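The per-socket claim checks out on a back-of-envelope basis (a sketch with illustrative parts: a Broadwell-era Xeon E5-2680 v4 against an assumed 64-core modern server chip; both taken at ~16 double-precision FLOPs per cycle per core, so the modern part wins mostly on core count):

```python
def peak_dp_gflops(cores: int, ghz: float, flops_per_cycle: int = 16) -> float:
    """Peak double-precision GFLOPS: cores x clock x FLOPs/cycle/core."""
    return cores * ghz * flops_per_cycle

broadwell = peak_dp_gflops(cores=14, ghz=2.4)  # Xeon E5-2680 v4
modern = peak_dp_gflops(cores=64, ghz=2.4)     # assumed 64-core part

print(f"{broadwell:.0f} vs {modern:.0f} GFLOPS "
      f"({modern / broadwell:.1f}x per socket)")
# -> 538 vs 2458 GFLOPS (4.6x), before power-efficiency differences
```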

Thrashy ,

This used to hold broad cultural applicability, back in the Before Times when the "Hitler Did Nothing Wrong" crowd was still excluded from the political mainstream. Norms excluding out-and-proud ethnofascists from official, public participation in the English-speaking political right started to seriously slip around the time of Obama's election and certainly ceased to exist after Trump's win in 2016, but prior to that time "Nazi" was very much more often an ad-hominem attack than an accurate description of somebody's politics.

Thrashy ,

My first attempt to switch to Linux for my primary desktop was in 2007, and it ended when my attempt to run WoW via WINE mostly worked, but had a weird and completely unfixable audio delay.

Proton (and Valve's efforts on SteamOS and the Steam Deck more generally) have been an absolute godsend for Linux as a usable daily-driver.

Thrashy ,

Frankly, given the conflicting priorities and attitudes of the two primary Lemmy devs compared to the needs of instance admins, I'd rather the better-funded instances pooled some of their excess and funded an independent contributor to work on mod tools, GDPR issues, and other things that operators are concerned about that have been backburnered by the current devs.

Thrashy ,

Bryan Cantrill's rant about Oracle and Larry Ellison comes to mind. "Do not fall into the trap of anthropomorphizing Larry Ellison."

Thrashy ,

I'm a bit squeamish, so I arranged myself so as to be seated basically next to my wife's head, facing the wall, and was laser focused on holding her hand and maintaining eye contact with her.

Meanwhile, the delivering doctor was narrating a play-by-play as our kid went from just barely crowning to head fully out in three contractions, and then she just had to maneuver his shoulder free and he popped out on the fourth push. Three random things I will never forget from that night:

  • The doctor seeing the umbilical cord and announcing "That's a man that likes to eat!"
  • The doctor further complimenting my wife that she "rocked that thing out like it was her job"
  • One of the nurses looking into the hazmat bucket they'd packed the placenta into and muttering "Jesus Christ..."

Overall, 10/10, never doing it ever again.

Thrashy ,

I used to know a poli-sci researcher who was trying to take a big-data look at the success and failure of revolutions, taking in variables like "How many demonstrators rallied against the government?", "How many dissidents were disappeared by internal security forces?", and even things like "How many bullet holes are there on the buildings around the main protest venue in the capital?"

I asked him once if he'd discovered the secret to a successful revolution, and he just grimaced at me.

Thrashy ,

What's a little Third Reich here or Reign of Terror there between friends, eh? Besides, it's not like a little bit of anti-intellectual purging or nationwide famine isn't worth enduring to get to a better world for the people left afterwards!

Thrashy , (edited )

It's a damn shame, too, because the commercial software in the sector is abusively overpriced, and there's just nothing to be done about it (unless somebody can get antitrust regulators to pay attention, which hasn't happened yet, and I'm not holding my breath for it).

It's not like the FOSS options out there aren't fundamentally capable of doing the job, either -- it's just that they almost universally seem to have been designed by people who think of GUIs as a concession to the normies, and don't understand typical or expected design workflows. I'd love to be able to use FreeCAD instead of Fusion for hobby projects, but just creating a sketch in the former is like fighting through molasses compared to the process in Fusion. A bit of focus on UI instead of under-the-hood features would go a long way toward making these programs viable competitors -- look at how Blender's perception among professionals changed after it ditched its idiosyncratic pre-2.8 UI, for instance.
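For what it's worth, FreeCAD is far more pleasant to script than to click through; a minimal sketch-by-script, assuming you're running it in FreeCAD's built-in Python console (module and class names per recent releases):

```python
import FreeCAD
import Part
import Sketcher  # noqa: F401 -- loads the Sketcher module

doc = FreeCAD.newDocument("Demo")
sk = doc.addObject("Sketcher::SketchObject", "Sketch")

# A 10 x 10 mm square, one line segment at a time.
pts = [(0, 0), (10, 0), (10, 10), (0, 10)]
for i in range(4):
    a, b = pts[i], pts[(i + 1) % 4]
    sk.addGeometry(Part.LineSegment(FreeCAD.Vector(*a, 0),
                                    FreeCAD.Vector(*b, 0)))

doc.recompute()
```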

Don't even get me started on BIM software... Ridiculous subscription pricing, barely a bug fix to be found, and feature requests ignored for a decade or more! The headline new feature of Revit's last release was (drumroll, please...) a dark UI mode. Good to see Autodesk put my employer's seven-figure subscription payments to good use. 😑

Thrashy , (edited )

Problem being, because big tech money has so distorted the economies of the cities it's clustered in, many of these people can only choose between finding another tech job ASAP, moving away from their industry to a lower cost metro with limited job opportunities, or imminent homelessness. Driving a forklift won't pay the rent, and commercial real estate is so absurdly priced that there may not even be a restaurant to wait tables at.

Thrashy ,

Joke's on you, creep, I have a toddler. All my algorithm serves up now is Blippi videos, steam locomotives, and construction equipment!

Thrashy ,

For some, the optionality of it matters less than the notion that if it's performative, you can be bad at it, and thereby make yourself an acceptable target for abuse. Beyond that, the idea that some roles can be restricted to only those with a certain set of physical characteristics is deeply ingrained in many people, whether in terms of gender, career, or what have you.

Thrashy ,

I'm Commander Shepard, and this is my favorite issue of Fornax on the Citadel!

[Thread, post or comment was deleted by the author]

Thrashy ,

Battletech let you choose your pronouns independently from the gender presentation of your little 2D avatar icon, and butthurt GamerGaters review-brigaded the game and harassed the one trans developer on the team.

Great game, though, even if your ops guy's idea of advance intel is telling you about reinforcements the turn after they open fire on you, the DropShip pilot sometimes lands by Braille on your head, and your shipboard engineer is probably a plant for the cornball techno-Illuminati who have been doing a space CIA for the last few hundred years... But I digress.

Thrashy ,

The only cat I've had that I've felt okay with letting roam was a stray that came to us declawed, so he was mostly harmless. We still ended up making him an inside cat because we caught him sneaking into the neighbor's house to steal their cat's food and poop in its litterbox.

Thrashy ,

I would never do it to a cat, but when this particular one wandered into my then-girlfriend's house one night and decided he lived there, he was already declawed. He never seemed to suffer too badly from it, fortunately.

Thrashy ,

Not a lot of coyotes in our neck of the woods, but the little orange moron kept writing checks with the neighbor cats that his disarmed front paws couldn't cash, so he was always coming back with scratches. One of the other reasons we stopped letting him out.

Thrashy ,

We actually found out when my wife was over visiting, and he came in through the cat door, locked eyes with her, froze, and slowly backed out of the house. 😅

Thrashy ,

USB-A is only rated for 1,500 mating cycles, whereas USB-C is supposed to last for 10,000... though in reality the ports on my phones seem to get too loose to hold the plug in at around 1,000 cycles. Still, it's not a totally unreasonable standard, and any device that hasn't been designed to be thrown away should have its ports broken out onto replaceable subassemblies -- as is the case with many business-class laptops, for instance -- so replacing a failed USB-PD port shouldn't be too much trouble.
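Those cycle ratings translate directly into a rough service life (a minimal sketch; the ratings are the figures quoted above, the plug-in frequency is my assumption):

```python
RATED_CYCLES = {"USB-A": 1_500, "USB-C": 10_000}
plugs_per_day = 3  # assumed: connected and disconnected a few times daily

for port, cycles in RATED_CYCLES.items():
    years = cycles / (plugs_per_day * 365)
    print(f"{port}: ~{years:.1f} years at {plugs_per_day} plugs/day")
# -> USB-A: ~1.4 years, USB-C: ~9.1 years (real-world wear often sooner)
```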

Thrashy ,

I installed KDE Neon on Friday evening and things were going great: everything was testing well, and Saturday game night with the gang went flawlessly. But this morning the VMware Horizon Linux client spontaneously decided that it didn't want to accept mouse input anymore, so after ten minutes of troubleshooting I gave up and booted back into Windows so that I could be productive today.

A battle lost, but the war is not over yet.

Software company RealPage must face tenants’ price-fixing lawsuit over multifamily housing — ‘Landlords’ knowledge that sensitive information would be used to price each other’s units is circumstant... (www.reuters.com)


Thrashy ,

    "We're not colluding with our competitors to fix prices on rental units, we all use a third party service to collude for us* is such a bullshit legal excuse that I'm shocked they even floated it.

Thrashy , (edited )

Greed isn't the problem, per se -- it's that outside of the biggest sites, which could hoover up ad-targeting data on hundreds of millions to billions of users and sell it through their own internal ad platforms, the model was never viable to begin with. Notice that the enshittification really took off as soon as interest rates jumped? Tech startups have all been floating along on easy money, but now that loans aren't basically free, VC dollars are drying up. Companies that could previously offset their capital burn with yet another round of investment suddenly need to make money on their own merit, and are finding that they have to cut service to the bone and monetize the bejeezus out of what's left if they're to have any chance of survival.
