As someone who is just about to finish a degree in games programming:
Study CS and follow tutorials online to learn gaming tools. I've spent the last few years of my life learning Unity's C# and the tools around it, only to start looking for jobs and realize, firstly, that my uni completely fucked me by not teaching Unreal (the uni and Epic Games HQ are practically on the same fucking bus line), and on top of that, games employers are looking almost exclusively for experienced devs. And of those jobs, half are going to be an insulting pay cut, and the rest are going to be a soulless SaaS Call of Duty or Fortnite model. Working in games isn't really worth it unless you can get hired by a AAA studio and love their game too. Probably best to find a standard dev job and make a game on your own time as a passion project.
Oh, also non-compete clauses are going to mean if you work for AAA, you immediately can't make your own stuff anymore either.
Probably best to find a standard dev job and make a game on your own time as a passion project.
I watch twitch streamers who make games, and this seems to be the way to go. I can't really judge through a screen, but they seem happy and excited to work on their stuff.
Oh, also non-compete clauses are going to mean if you work for AAA, you immediately can't make your own stuff anymore either.
Depending on your jurisdiction, these can have varying degrees of enforceability. A quick look at the Wikipedia page for them tells me they are mostly void in California. Although I suppose no one wants to get into a legal battle they can avoid.
This seems like you're introducing selection bias. Do you really want laypersons to influence your code? You should poll a representative sample of coders for a random number, and use the mode of that.
That's a very long list of different techniques with examples, very cool!
Though I wonder, is there some connection to image processing, high dynamic range?
Or audio compression, the kind which brings out the kick in the mix, not the kind which saves disk space?
The similarity I see between all three fields is that they try to bring down extreme values, the outliers, to level the field, while still retaining the overall character.
I didn't know about that [under this name], so thanks for bringing it up. But no, I meant something slightly different.
Colors of noise describes how to generate different distributions. What I meant was how to transform distributions.
Many of the examples in the article start with a random number distribution, and then transform it to reduce discrepancy.
This reminded me of audio/video signal processing. For example, one can take a picture and transform it to reduce discrepancy (so that neither very bright parts nor very dark parts overshoot). Or you can take an audio sample and transform it to reduce discrepancy in loudness.
So the idea was that maybe techniques of either field (RNG, audio, video) could be applied to both other fields.
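To make the analogy concrete, here's a toy Python sketch (the `compress` function and its `drive` knob are my own invention, not from the article): the same kind of soft-clipping curve an audio limiter uses, `tanh`, pulls a distribution's outliers in while preserving ordering, much like tone mapping does for bright pixels.

```python
import math
import random

def compress(x, drive=1.0):
    """Soft-clip x into (-1, 1): extreme values are squashed hard,
    values near zero pass through almost unchanged. The same curve
    shape shows up in audio limiters and Reinhard-style tone mapping."""
    return math.tanh(drive * x)

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]
squashed = [compress(s) for s in samples]

print(max(abs(s) for s in samples))   # the raw Gaussian overshoots well past 3
print(max(abs(s) for s in squashed))  # the squashed values never leave (-1, 1)
```

The transform levels the field in exactly the sense above: the outliers come down, while the ordering and the bulk of the distribution survive.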
Object pooling is an absolute necessity for performance in modern environments that remove all manual memory management in favour of automatic garbage collection. And it's still good practice to reuse memory when you do have some manual control.
Not many things will slow your program down (or make a garbage collector blow up) as effectively as alternately freeing and requesting tiny chunks of memory from the OS thousands of times a second.
Honestly this is usually bad advice nowadays, for a bunch of reasons:
- Modern allocators do the same thing as object pooling internally, usually faster. They rarely interact with the OS.
- A GC will do things like zero old memory on another thread, unlike a custom clearing function in a scripting language.
- Object pooling breaks the generational hypothesis that most modern garbage collectors are designed around; it can actually make performance much worse. Most GCs love short-lived objects.
- Object pools increase code complexity and are very error prone; when you add a new field you have to remember to clear it in the pool.
- If you are in a non-GC language you probably want something "data-oriented" like a slotmap, not a pool of object allocations.
Having said all that, it still all depends on the language/VM you're targeting. The guy in the video clearly benchmarked his use case.
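For concreteness, here's a minimal free-list pool sketched in Python (class and method names are invented; a real Unity C# pool looks similar in spirit). It shows both the win the first commenter means — no fresh allocation on reuse — and the hazard from the list above, the manual reset step:

```python
class Bullet:
    """Example pooled object; the fields are illustrative."""
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.alive = False

class BulletPool:
    """Minimal free-list pool: acquire() reuses a dead object instead of
    allocating, release() returns one. The reset in acquire() is exactly
    the error-prone part: add a field to Bullet and forget to clear it
    here, and stale state leaks into the 'new' object."""
    def __init__(self, size):
        self._free = [Bullet() for _ in range(size)]

    def acquire(self):
        b = self._free.pop() if self._free else Bullet()  # grow if exhausted
        b.x = b.y = 0.0   # manual reset -- the maintenance hazard
        b.alive = True
        return b

    def release(self, b):
        b.alive = False
        self._free.append(b)

pool = BulletPool(2)
a = pool.acquire()
a.x = 99.0
pool.release(a)
b = pool.acquire()
print(b is a)  # True: the same object is reused, no new allocation
print(b.x)     # 0.0: only because acquire() remembered to reset it
```

Whether this beats the allocator/GC is exactly the benchmark-it-per-VM question above.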
As someone who inevitably gets thrown into the "devops" side and the like:
The vast majority of developers can't integrate code or even resolve a merge conflict (and god help you if someone convinced the team to do rebasing instead...). They just stop working and then whine three weeks later during a standup that progress is halted on their deliverables. And, because of the stupidity of "devops" as a job role, there is an increasing culture that development should not have to worry about this kind of stuff.
So good project management becomes splitting up files to minimize the chance for conflicts and spreading tasks out to minimize the times people will be in the same file, let alone function. And if they do? Then you do whatever the latest buzzword is for "pair programming".
I will never understand the idea that rebasing inherently causes problems. Rebasing gives a much cleaner history and reduces the number of commits with multiple parents, making the history approximate a simple tree rather than a more complex graph.
The simple rule is branches that only you work on can be rebased, shared branches must be merged.
I've never understood the complaints about rebasing. Just make sure you merge if it is complicated.
Jokes aside: It honestly isn't THAT much worse. But if you don't "understand" git, you can fuck up your history and it is a real mess to recover from a "failed but technically not" rebase. Whereas merges just result in a shitfest of a history but everything gets reconciled.
Although, a bit of a rant: I do still think rebasing is fundamentally "bad" for long term debugging. As a simple example, let's say that you changed a function signature on branch A and merged it in. Branch B uses that function and started before A. After a rebase, there is no indication that those previous commits would not have worked or were counting on a different signature.
Generally speaking, you can avoid this so long as you always add a merge commit to go with the pull requests (or futz around a bit to identify the known good commits). You assume that those are the only valid commits and move from there. But when you are debugging a bug that silently got added two years ago and think you are clever because you know how git bisect works? You suddenly have a lot of commits that used to work but don't anymore.
It doesn't come up often (especially since so many workflows these days are "throw out the old code and redo it, but right this time") but you only need to run into that mess once or twice to no longer care about how clean your history is.
This feels like a problem where I just haven't had a complex enough code base to worry about it. I like rebasing because it feels more like I am committing when I intended, but if the deltas were too great it would be a huge issue.
Wouldn't smaller, more frequent changes solve this too?
If your project/code base suits itself well to being nothing but small feature branches, sure.
But the reality is that you are going to have "long-living feature" branches where it doesn't really make sense to merge any of the code in until it all works.
The “long lived feature branch” approach is kind of the entire problem that leads to merge hell though. Having long lived branches is at odds with the rebase centric style and that’s intentional. Rebasing incentivises keeping branches small and getting stuff into main as often as possible. Basically it’s about using git in a “trunk” style.
The next question is “what if I don’t want this code to go live yet” to which the usual answer is “feature toggles”
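A feature toggle can be as small as a dictionary lookup. Here's a minimal Python sketch (the flag name and JSON format are invented for illustration; real systems usually pull flags from a config service or a per-environment file). The idea is that code merged to main stays dark until the flag flips, and an unreadable or missing flag defaults to off:

```python
import json

# Invented example flags file content.
FLAGS_JSON = '{"new_checkout_flow": false}'

def flag_enabled(name: str) -> bool:
    """Return the flag's value, treating unknown or unparseable flags
    as OFF so half-finished code can't be enabled by accident."""
    try:
        flags = json.loads(FLAGS_JSON)
    except json.JSONDecodeError:
        flags = {}
    return bool(flags.get(name, False))

if flag_enabled("new_checkout_flow"):
    path = "new code path (merged to main, but dark)"
else:
    path = "old code path"
print(path)  # old code path
```

Defaulting unknown flags to off is the usual mitigation for the worry that unfinished functionality is one config mistake away from going live.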
People get very dogmatic about this stuff haha. I’ve worked in teams that were very hype about using rebasing and teams that could easily handle long lived feature branches. The difference is the latter kind of team weren’t all trying to edit the same files at the same time. Hmm. So yeah I guess what works best is situational!
EDIT: I just realised this is a gamedev community whereas the above comment is based on my experience in the “enterprise business factory” dev world. Might be a bit different over here, not sure!
Pure ideologies work until you have tight deliverables. And "feature toggles" become problematic if you need to do a "release" and don't want undocumented (possibly unfinished) functionality that is one malformed configuration file away from going live.
At the end of the day, it is about balancing "clean" development with acknowledging that you are going to need to cut corners. Generally speaking, "open source" projects can get away with a strong focus on ideology because they don't have deliverables that can mean the difference between being rockstars and being this week's tech layoffs.
Our last major college project that spanned multiple semesters was worked on by 5 devs all editing the same source files over Dropbox. The school had servers for svn, but no one knew how to do source control. It was exactly the type of shitshow you would expect.
I love this so much :D That reads like something I'd expect from ZA/UM, but it also thankfully alleviates most of the major issues I had with the game, which I've already talked about here on Lemmy. I really liked the game, but there were a lot of red flags pointing to it being just a quick corporate cash grab, where they decided to basically re-skin their previous game with as little effort as possible, to quickly sell it and cash in on the Pokemon thing. It just smelled of corporate greed, and like they did not really care about the game too much.
But assuming this screenshot is true, I'd say it's clear the development wasn't driven and pushed by corporate greed, but really was just a few guys trying their best.
As a Craftopia player, Palworld definitely feels like a bit of a reskin, but one that gives players a lot of what they wanted (mainly being able to explore freely in multiplayer mode, which is severely limited in Craftopia).
One element Palworld leaves out is being able to create your own automated processes (like automating a farm with a series of conveyor belts, chests, and various machines). They say they're still planning to develop Craftopia, so I'm pretty excited to get the elegance of the Palworld pets (which Craftopia had too, but not as shiny) and the fun of automating your own homestead instead of setting up prefab stations.
I guess I'm a little confused, because wasn't their previous game Craftopia? I'm fairly certain that game sold relatively well for an indie game, at least 25,000+ copies if you base it off the All Time Peak Players on Steam Charts. For a small team of just a few people, that's a decent chunk of money (I think it sold at $30 around launch, so roughly $750k for 25k copies sold before Steam takes its cut). Craftopia came out in 2020, so they're saying they've learned virtually nothing in 4 years, not counting the dev time Craftopia had?
I enjoy Palworld, I think it's a fun game that has a lot of potential, but I'm not sure I'm fully buying into some of these responses.
As a side note, a lot of Craftopia people complain they abandoned that game, but looking on Steam it shows several recent updates across the last year, with one even coming out just yesterday and a huge one in November 2023 and a new roadmap posted in December 2023. So, I'm not really sure where those players are coming from regarding that.
Regardless, I'm looking forward to what Palworld grows into, as I really do think they have something rather special here. It's got a lot of rough edges and a couple core design problems, but those can eventually be addressed with some hard work. Hopefully they use the massive cash influx they've achieved with their recent success and hire some competent, seasoned developers to come in and get their shit in order. I'm not holding my breath too much, though (remember when we thought Valheim devs would spend their game success lottery money to massively boost that game's content and polish?).
Agreed on both points - I am skeptical they are such "amateurs", and it also doesn't necessarily seem like a cash grab considering how Craftopia is faring. There does seem to be way too much buzz currently, however; it's hard to say what is true and what is just an outright lie... hoping to learn more and see how this progresses in a few months. Also hoping they stay dedicated to improving Palworld more than the Valheim devs did (meaning barely at all, thank god for the modding community there doing what little they can to keep the game alive).
Idk, Craftopia has a pretty committed fan base, and combined with being free on Game Pass I'm not surprised at the buzz. Very curious to know the Game Pass player numbers compared to Steam. I wouldn't be surprised if Microsoft has done some promotional stuff for free, though (to promote Game Pass).
It turns out that most info from the screenshot is false, there's a better article that's written by the actual developer linked in the updated post.
He did talk about them not having a budget plan, which was a fairly long part of the article, but it can be summed up like this:
Figuring out budget is too much additional work, and we want to focus on our game. Our budget plan is “as long as our account isn’t zero, and if it reaches zero, we can always just borrow more money, so we don’t need a budget”.
He also further down mentioned actual numbers of how much went into the development:
Judging from Craftopia's sales, it's [the budget] probably around 1 billion yen...
Because all those sales are gone.
I think the lesson is not that bucket-o-flash-drives is a better way; I think the lesson is that you can make a not-ideal process work completely fine if you just keep focused on the main point. People made successful software way before version control existed. It just makes it easier but that's all it does.
Linux was written up to about version 2.2 or 2.4 or thereabouts with no version control, just diff and patch and email. They invented git because at a certain point they wanted automated tools to make their way of working easier and more automated (which none of the suitable VCSs of the time were capable of), but it wasn't like they couldn't do the job until the tools existed.
Yes. The little word "suitable" was doing a lot of work in my explanation, it's true. BK was invented by a kernel developer for pretty much exactly the reasons I explained; I just glossed over the part of the route that went diff+patch -> BK -> git as a detour. I was actually a lurker on the linux-kernel mailing list while all this was going on and saw the fireworks in real time.
It's not exactly accurate to say BK withdrew support. Larry McVoy was such a pain in the ass about wanting to control how people used his product that the kernel developers felt the need to write their whole own solution as opposed to continue putting up with him. (Specifically, his pulling Andrew Tridgell's license for dubious reasons was the straw that broke the camel's back.) RMS actually wrote a sarcastic open letter thanking McVoy for providing a good lesson in why it's a bad idea to let your important stuff be dependent on proprietary software.
But the point remains; BK was invented by a member of the kernel community specifically because none of the existing solutions were usable for them, after more than 10 years of one of the biggest and most-distributed software projects in the world having no source control whatsoever.
Project Zomboid had to start over after their flat got burgled and the laptop was gone. Offsite backups are key against theft and fire, and version control is the easiest and cheapest way to get them.
Someone always knows someone who drinks, smokes, and eats crap and lives until their mid-90s. That doesn't mean it's good health advice.
Game Development