I've been messing around with GitLab as a self-hosted alternative for a few years. I do like it, but it is resource-intensive!
For the past few days I've been playing with Forgejo (from the Codeberg people). It is fantastic.
The biggest difference is memory usage. GitLab is Ruby on Rails plus over a dozen services (GitLab itself, then nginx, Postgres, Prometheus, etc.). Forgejo is written in Go and ships as a single binary.
I have been running GitLab for several years (for my own personal use only!) and it slowly but reliably eats up the entirety of the RAM on a 16 GB VM. I have only been playing with Forgejo for a few days, but it is using only 300 MB of the 8 GB of RAM I allocated, and that machine is running both the server and a runner (idle, but still...).
I'm really excited about Forgejo and dumping GitLab. The biggest difference I can see is that Forgejo does not have GraphQL support, but the REST API seems, at first glance, to be fine.
EDIT: I don't really understand the difference between gitea and forgejo. Can anyone explain? I see lots of directories inside the forgejo volume when I run using podman that clearly indicate they are the same under the hood in many ways.
Our product studio, currently around 50 users who need daily git access, moved to a self-hosted Forgejo nearly 2 years ago.
I really can’t overstate the positive effects of this transition. Forgejo is a really straightforward Go service with a very manageable mental model for storage and config. It’s been easy and cheap to host and maintain, our team has contributed multiple bugfixes and improvements, and we’ve built a lot of internal tooling around Forgejo which otherwise would’ve required a much more elaborate (and slow) integration with GitHub.
Our main instance is hosted on premise, so even in the extremely rare event of our internet connection going offline, our development and CI workflows remain unaffected (Forgejo is also a registry/store for most package managers so we also cache our dependencies and docker images).
Just run `podman login` or `docker login your.forgejo.instance.address`, then push to it as normal. The target repo must already exist. You can check the images under Site Administration -> Packages.
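A minimal sketch of that flow, assuming a hypothetical instance address and owner name:

```shell
# authenticate against the Forgejo container registry (hypothetical host)
docker login forgejo.example.com

# tag an existing local image under the registry/owner/repo path, then push;
# the repo must already exist on the instance
docker tag myapp:latest forgejo.example.com/myowner/myapp:latest
docker push forgejo.example.com/myowner/myapp:latest
```

Pulling works the same way with `docker pull forgejo.example.com/myowner/myapp:latest`.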
Speaking of authentication, it also works as an OpenID provider, meaning you can authenticate any other web software that supports it against Forgejo... which in turn can look for users in other sources.
It also has wikis.
It's an underrated piece of software that uses a ridiculously small amount of computing resources.
That's so brilliant. Wow. I'm struggling to wrap my brain around how they not only support OCI (docker) but also APK (alpine) and APT (debian) packages. That's a very cool feature.
Ease of maintenance is an even bigger difference. We've been using gitea for a bit over five years now, and gitlab for a few years before that, and gitea requires no maintenance in comparison. Upgrades come down to pulling the new version and restarting the daemon, and take just a few seconds. It's definitely the best solution for self-hosters who want to spend as little time as possible on their infrastructure.
Backups are handled by zfs snapshots (like every other server).
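For anyone unfamiliar, that style of backup is roughly a one-liner per dataset; a sketch with a hypothetical pool/dataset name:

```shell
# create a point-in-time snapshot of the dataset holding the forge data
zfs snapshot tank/gitea@2025-01-15

# list snapshots, and roll back to one if a restore is ever needed
zfs list -t snapshot -r tank/gitea
zfs rollback tank/gitea@2025-01-15
```

Snapshots can also be shipped off-box with `zfs send`/`zfs recv` for real redundancy.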
We've also had at least 10× lower downtime compared to github over the same period of time, and whatever downtime we had was planned and always in the middle of the night. Always funny reading claims here that github has much better uptime than anything self-hosted from people who don't know any better. I usually don't even bother responding anymore.
I guess I'll just chime in that while GitLab is a very heavy beast, I have self-hosted it for over a decade with little to no issues. It's pretty much as simple as installing their Omnibus package repository and doing `apt install gitlab-ce`.
When I self-hosted GitLab I never found the maintenance to be that bad: just change a version in a compose.yml, sometimes jumping between blessed versions if I'd missed a few back-to-back.
Like others, I've switched to Gitea, but whenever I do visit GitLab I can't help but think the design/UX is so much nicer.
My usual impression of GitLab is that it has too many functions I don't ever use, so the things I actually do want (code, issues, PRs, user permissions) are needlessly hidden. What's your workflow that you find GitLab's UX to be nicer than Gitea's?
For instance, I just got tripped up trying to sign out of my Gitea instance, since the mobile design has two identical-looking avatar + username blocks on top of each other, one being the org switcher, the other a menu (with no indicator) containing the sign-out button.
I went to a project page, and it auto focused the search input (???), causing a zoom in on mobile.
I just prefer the design / look and feel of GitLab more than Gitea/Forgejo. It's not really a hot take; GitLab has been around a lot longer and has much more support.
That was my take too. It is a big project with a lot of functionality. But, I never needed all of that functionality, so it just seemed bloated to me. I switched over to Gitea for self-hosted code repositories (non-public repos behind a firewall) a while back and haven't had any issues thus far.
I found gitea's interface to be so unusably bad that i switched to full-fat GitLab.
Gitea refused to do some perfectly sensible action- I think it had something to do with creating a fork of my own repo. Looking online, there's zero technical reason for this, and the explanation given was "this is how GitHub does things". Immediately uninstalled. I'm not here for this level of disrespect.
> I found gitea's interface to be so unusably bad that i switched to full-fat GitLab.
Was this Gitea pre-UI redesign or after? 1.23 introduced some major UI overhauls, with additional changes in the following releases. Forgejo currently represents the Gitea 1.22 UI, reminiscent of earlier GitHub design.
It's a shame that GitHub won the CI race by sheer force of popularity and it propagates its questionable design decisions. I wish more VCS platforms would base their CI systems on Gitlab, which is much much better than GitHub actions.
A really neat feature is that you can also trigger a job by just submitting a yaml file (with the web interface, the API or the cli) without needing to push a commit for each job. This is neat for infrequent tasks, or for testing CI manifests before committing them.
What exactly is the advantage of running something like GitLab vs what I do which is just a server with SSH and a file system? To create a new repo I do:
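(Per a reply further down, the command in question is a bare init; a sketch, run on the server:)

```shell
# on example.com: create a bare repository to push to
mkdir -p repos
git init --bare repos/my-proj.git
```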
Then I just set my remote origin URL to example.com:repos/my-proj.git
The filesystem on example.com is backed up daily. Since I do not need to send myself pull requests for personal projects and track my own TODOs and issues via TODO.md, what exactly am I missing? I have been using GitHub for open source projects and work for years but for projects where I am the only author, why would I need a UI besides git and my code editor of choice?
CI runners would be the main advantage of GitLab over bare Git, I think. Also if you want to show other people your personal project at some point, it may be nice to be able to link to a diff or a historical version of a file that they can see in a browser. Or just a syntax-highlighted file or a rendered Markdown or Jupyter file. Also previous release tarballs.
We at $DAYJOB had an internal git server that was literally what the parent of this comment mentioned (`git init --bare`). It became a little cumbersome, so when I stumbled across forgejo, I was happy to see that importing the existing git repos was a breeze, just had to point the config to look at the existing git storage root and assign groups and permissions via the GUI.
Collaboration and specifically collaboration with non git nerds. That's primarily what made GitHub win the VCS wars back in the day. The pull request model appealed to anyone who didn't want to learn crafting and emailing patches.
Yes, it's the PRs, and there is a misunderstanding I think because the OP and the GP's use-cases are quite different. Self-hosting your own repository on a remote server (and perhaps sharing it with 1 or 2 collaborators) is simple but quite different than running a public open source project that solicits contributions.
Yes, the projects that are emailing patches around generally have a much higher bar than the ones that accept GitHub PRs, but whatever works for a given project, I guess.
You don’t! Forges are for collaboration outside of the rhythm of git commits. Are you happy to make a new commit every time you have something to add to an issue? With X issues and Y comments an hour, polluting the git timeline with commentary is going to become unhelpful.
Setting up a server with SSH and GitLab is more work than setting up a server with SSH. Dropbox is great and I use it but only because I can’t get the same functionality out of rsync without major additional orchestration. But if I am the only one working on my own project why would I need a second read-only UI for my own code?
If you're working alone you can also send raw IP packets down the wire by way of telegraph key if you'd like. What you do alone behind closed doors isn't really anyone's business and is up to you. For everyone else, the benefit of using Gitlab is that once it's set up, a wide range of users of varying skill levels and backgrounds can use it to collaborate.
I wish I could search in Gitlab like I could in Jira's SQL-esque query syntax, but otherwise its interface is a step-up, if still pretty "busy" for my taste.
If you want even more minimal, Gerrit is structured as a Java app with no external dependencies like databases, and stores all its configuration and runtime information on the filesystem, mostly as data structures in the git repos.
A shared filesystem is all you need to scale/replicate it, and it also makes the backup process quite simple.
I might be one of the few that is intrigued by this being that it’s Java but this looks really neat. Does it do git repositories like gitea, GitHub, etc, or is it more of a project management site for the repositories? They describe it as “code review”, so I wasn’t sure.
I’m a little put off on the google connection but it seems like it could run rather independently.
It's hyper-focused on code review and CI integration, which it does really well.
It's not focused on all the other stuff that people think of in code forges (hosting the README in a pretty way, arbitrary page hosting, wiki, bug tracking, etc.) but can be integrated with 3rd party implementations of those fairly trivially.
> I’m a little put off on the google connection but it seems like it could run rather independently.
Yeah, it's actually a really healthy open-source project: Google contributes usually around 40% of the code, but you have other companies like GerritForge (disclaimer: I work here), Nvidia, SAP, Qualcomm, and the Wikimedia Foundation all contributing heavily to it.
Coming from Github myself, I cannot imagine going back to it after using Gerrit for even just a few days.
The workflow in Gerrit really makes a lot of sense; unfortunately, it's the workflow in GitHub that has screwed up everyone's idea of what code review should look like[1], even by one of GitHub's co-founders' own admission.
I personally find the rebase- and stacked-commit-focused method of integration that Gerrit uses to be easier and cleaner than PRs in GitHub.
Having done CI integrations with both, Gerrit's APIs send pre- and post-merge events through the same channel, instead of needing multiple separate listeners like GitHub.
We've been looking at Forgejo too. Do you have any experience with Forgejo Actions you can share? That is one thing we are looking at with a little trepidation.
I set up Actions yesterday. There are a few tiny rough edges, but it is definitely working for me. I'm using it to build my Hugo blog, which sprinkles in a Svelte app, so it needs nodejs + hugo and a custom orchestrator written in Zig.
What I did:
* used a custom docker image on my own registry domain with hugo/nodejs and my custom zig app
  * no problems
* store artifacts
  * required using a different artifact "uses": v3 instead of v4 (uses: actions/upload-artifact@v3)
  * an example of how there are some subtle differences from GitHub Actions, but IMHO this is a step forward, because GitLab CI YAML is totally different
  * can't browse the artifacts like I can on GitLab, it only allows download of the zip. Not a big deal, but it's nice to verify without littering my Downloads folder.
* unable to use "forgejo-runner exec", which I use extensively to test whether a workflow is correct before pushing
  * strange error: "Error: Open(/home/runner/.cache/actcache/bolt.db): timeout"
  * I think GitLab broke this feature recently as well!
* getting the runner to work with podman and as a service was a little tricky (but now works)
  * mostly because the docker socket is not created by default on podman
  * and the docker_host path is different inside the runner config file
* there are two config files: one (JSON) is always stored in .runner and contains the auth information and IP; the other is YAML, needs the -c switch to specify it, and holds the config of the runner (docker options, etc.). It's a bit strange there are two files IMHO.
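For reference, pulling those notes together into a minimal workflow sketch (the image name, registry domain, and build command are hypothetical):

```yaml
# .forgejo/workflows/build.yaml -- hypothetical sketch of the setup above
on: [push]
jobs:
  build:
    runs-on: docker
    container:
      # custom image with hugo, nodejs and the zig orchestrator preinstalled
      image: registry.example.com/blog-builder:latest
    steps:
      - uses: actions/checkout@v4
      - run: hugo --minify
      - uses: actions/upload-artifact@v3  # v3, not v4, is what worked here
        with:
          name: site
          path: public/
```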
This will occur if you have a `forgejo-runner daemon` running while you try to use `exec` -- both are trying to open the cache database, and only the first to open it can operate. You could avoid this by changing the cache directory of the daemon by changing `cache.dir` in the config file, or run the two processes as different users.
> It's a bit strange there are two files IMHO.
The `.runner` file isn't a config file, it's a state file -- not intended for user editing. But yes, it's a bit odd.
We use them in our shop. It's quite straightforward if you're already familiar with Github Actions. The Forgejo runner is tiny and you can build it even on unsupported platforms (https://code.forgejo.org/forgejo/runner) e.g. we've setup our CI to also run on Macs (by https://www.oakhost.net) for App Store related builds. It's really quite a joy :)
Are you building MacOS apps? More specifically, are you doing code signing and notarization and stamping within CI? If so, is this written up somewhere? I really struggled with getting that working on GitLab. I did have it working, but was always searching for alternatives.
One concern the post brings up - single point of failure. Yes, in this case, blah blah big company microsoft blah blah (I don't disagree, but..). I'm more worried about places like Paypal/Google/etc banning than the beast from Redmond.
Self-hosting is still a single point of failure, and as for the article's answer of "mirroring", well... it allows redundancy for reads, but what about writes?
Redundancy for read access to the source code is a concern for Dillo. Some years ago, the domain name registration lapsed, and was promptly bought by an impersonator, taking the official repository offline. If it hadn't been for people having clones of the repository, the source code and history would have been lost.
How do people find your online project and know it's you (instead of an impersonator) without relying on an authority, like GitHub accounts or domain names? It is a challenging problem with no good solution. At least now the project is alive again and more resilient than before.
I think it’s a fair concern. E.g., Forgejo's storage is a simple directory on disk, with an option to use S3 instead. It really is a no-brainer to set that up for as much resilience as necessary, with various degrees of "advanced" depending on your threat model and experience. The lack of a FAANG/M in the equation makes it even more palatable.
I found the banning comment to be odd. That said, all it really takes is a policy change (something that I see as far more likely in Microsoft's case) or simply a change in the underlying software (again, somewhat likely with Microsoft) for the platform to become unusable for them. Keep in mind that Dillo is a browser for those who can't or don't want to fit into the reality of the modern web.
I used, administered, set up, and customized many on-prem GitLab instances for years. GitLab doesn't leak memory; you're making that up. It's exactly as resource-intensive as the set of services you configure. Can't say the same for JIRA et al.
This comment makes me suspect this entire thread as some astroturfing for that other product.
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
> GitHub seems to encourage a "push model" in which you are notified when a new event occurs in your project(s), but I don't want to work with that model. Instead, I prefer it to work as a "pull model", so I only get updates when I specifically look for them.
I agree with the sentiment, but want to point out that email can be used to turn push into pull, by auto-filtering the respective email notifications into a separate dedicated email folder, which you can choose to only look at when you want.
> Additionally, GitHub seems to encourage a "push model" in which you are notified when a new event occurs in your project(s), but I don't want to work with that model. Instead, I prefer it to work as a "pull model", so I only get updates when I specifically look for them. This model would also allow me to easily work offline. Unfortunately, I see that the same push model has been copied to alternative forges.
Someone kind enough to explain this to me? What's the difference between push model and pull model? What about push model makes it difficult to work offline?
I would love to see more projects use git-bug, which works very well for offline collaboration. All bug tracker info is stored in the repo itself. https://github.com/git-bug/git-bug
It still needs work to match the capabilities of most source forges, but for small closed teams it already works very well.
Reminder that POP and IMAP are protocols, and nothing stops a code forge—or any other website—from exposing the internal messaging/notification system to users as a service on the standard IMAP ports; no one is ever required to set up a bridge/relay that sends outgoing messages to, say, the user's Fastmail/Runbox/Proton/whatever inbox. You can just let the user point their IMAP client to _your_ servers, authenticate with their username and password, and fetch the contents of notifications that way. You don't have to implement server-to-server federation typically associated with email (for incoming messages), and you don't have to worry about deliverability for outgoing mail.
All of this makes sense. Thank you for explaining. I don't think I understand the difference though.
Like are they calling the "GitHub pull request" workflow as the push model? What is "push" about it though? I can download all the pull request patches to my local and work offline, can't I?
GitHub pull request pushes you a notification/e-mail to handle the merge, and you have to handle the pull request mostly online.
I don't know how you can download the pull request as a set of patches and work offline, but you have to open a branch, merge the PR to that branch, test the things and merge that branch to relevant one.
Or you have to download the forked repository, do your tests to see the change is relevant/stable whatnot and if it works, you can then merge the PR.
---
edit: Looks like you can get the PR as a patch or diff, and it's trivial, but you have to be online again to get it that way. So getting your mail from your inbox is not enough; you have to fetch every PR as a diff, with a tool or manually, and then organize them. Emails are a much more unified and simple way to handle all this.
---
In either case, reviewing the changes is not possible when you're offline, plus the pings of the PRs are distracting if your project is popular.
Seems like you found it, but for others: one of the easiest ways to get a PR's diff/patch is to just put .diff or .patch at the end of its URL. I use this all the time!
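For example (owner, repo, and PR number hypothetical):

```shell
# append .patch (or .diff) to any PR URL to fetch the raw patch
curl -sL https://github.com/owner/repo/pull/123.patch -o pr-123.patch

# apply it locally; git am preserves authorship and commit messages,
# git apply just applies the diff to the working tree
git am pr-123.patch
```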
It’s bonkers to me that there isn’t a link to the plain patch from the page. Yes, it’s trivial to add a suffix once you know, but lots of people don’t, as evidenced by this thread.
Discoverability in UX seems to have completely died.
You could set up a script that lives in the cloud (so you don't have to), receives PRs through webhooks, fetches any associated diff, and stores them in S3 for you to download later.
Maybe another script to download them all at once, and apply each diff to its own branch automatically.
Almost everything about git and github/gitlab/etc. can be scripted. You don't have to do anything on their website if you're willing to pipe some text around the old way.
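As a sketch of that kind of piping (repo name hypothetical, and using the public GitHub REST API plus `jq`):

```shell
# list open PR numbers for a repo, then save each one's diff locally
for n in $(curl -s https://api.github.com/repos/owner/repo/pulls | jq -r '.[].number'); do
  curl -sL "https://github.com/owner/repo/pull/$n.diff" -o "pr-$n.diff"
done
```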
I would say it is time/life management: push tells you to do something now. In pull I check each Friday afternoon what's up in my hobby project and work on it for a few hours and then call it a day and be uninterrupted till next week.
We are in the diaspora phase; there is a steady stream of these announcements, each with a different GitHub alternative. I speculate that within a few months, the communities will have settled on a single dominant one. I'm curious if it will be one of the existing ones, or something new. Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
This has been going on for a decade. At the beginning it was projects moving to GitLab; now there are a lot of alternative projects, but GitHub is still the only one that counts for discoverability. It's a very small minority of projects that move away from GitHub, and it's way too early to declare GitHub doomed.
No different than everyone talking about the next “iPhone Killer” when someone other than Apple releases a phone. Although, I think that rhetoric has largely died down.
Github is fine for discoverability but as a development platform I think it's going to die. Public issues/PRs are a cesspool now and going to get worse, and agentic workflows are going to drive companies to want to hide how the sausage is made. People will gradually migrate to alternatives and mirror to Github while it remains relevant.
I always stayed with GitHub because it just worked the best. GitLab was slow and janky. gitea and its various forks lacked features and felt a step backwards. Sourcehut workflow is far too opinionated for my liking. Don't even get me started on GNU Savannah.
Some parts of the Free Software/Open Source crowd have always bemoaned the rise of GitHub, "because obviously you should use Free Software, it's your ethical duty!" Most people just use what works best, including many Free Software devs. There is a loud minority (even louder in bubbles like HN), but for most people it's just one factor out of many, at best.
The reason GitHub became dominant is fairly simple: it just worked the best. That doesn't mean it was perfect (remember how long it took for line numbers to not be copied from code examples?) but the alternatives were even worse.
It's interesting to see how badly they're messing it up. You'd think that making a new react-based frontend for a fairly uncomplicated issue tracker wouldn't be too hard, but seems like it is. Some initial bugs after a rewrite are normal, but ... it's been a year? I still regularly just see closed issues in my issue overview. The back button is basically just broken. These are not obscure heisenbugs: these are bugs you find after using it for five minutes. The entire experience is just so janky.
I don't think Github is dying at this moment. I do think that the regression of UX quality is a necessary pre-condition for its death. Like many things its death will happen "very gradually, and then suddenly all at once". Sourceforge once seemed omnipresent and that changed very quickly.[1] But who knows where things will end up in five or ten years?
[1]: I'd like to pre-empt the inevitable "that's because of the adware" comment that someone always seems to post: that's a false history. The adware happened well after it already lost its position and was the desperate attempt of a declining struggling platform for income.
Different devs have different preferred ways to work and collaborate. I doubt the FOSS community will converge on a single solution. I think we’re at a point of re-decentralization, where devs will move their projects to the forge that satisfies their personal/group requirements for control, hosting jurisdiction, corporate vs community ownership, workflow, and uptime.
This is due to increasing competition in the source forge space. It’s good that different niches can be served by their preferred choice, even if it will be less convenient for devs who want to contribute a patch on a more obscure platform.
> I speculate that within a few months, the communities will have settled on a single dominant one.
The solutions on the roadmap are not centralized the way GitHub is. There is a real initiative to promote federation, so we would not need to rely on one entity.
I love this, and hope it works out this way. Maybe another way to frame it: In 2 years, what will the "Learn Python for Beginners" tutorials direct the user towards? Maybe there will not be a consensus, but my pattern-matching brain finds one!
I really liked GitHub and I would even pay more for it, but that does not seem to be a priority. On Safari the whole PR review is barely usable any longer because of bad performance, without gaining any new features that I can discover. Obviously a lot of man-hours went into ruining the product, but I can’t understand why.
GitLab is too heavyweight for many projects. It’s great for corporations or big organizations like GNOME, but it’s slow and difficult to administer. It has an important place in the ecosystem, but I doubt many small projects will choose it over simpler alternatives like Codeberg.
GitLab is part of the reason I'm thinking along these lines: it has been around for a while as a known, reasonably popular alternative to GitHub. So I expected the announcement to be "We moved to GitLab". Yet what I observe is "We moved to CodeHouse" or "We moved to Source-Base". The self-hosting here, with mirrors to two forges I'm not familiar with, is another direction.
It looks like all that they’re doing is griping over frontends and interfaces that handle all the custodial work other than version control (i.e., the provisions baked into git itself).
> Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
Hah, exactly what we’re attempting with Tangled! Some big announcements to come fairly soon. We’re positioning ourselves to be the next social collab platform—focused solely on indies & communities.
Darcs & Pijul, with their Patch Theory-based approach, eliminate an entire class of merge conflicts, which makes working in a distributed manner more feasible. These DVCSes have good (subjectively “better”) foundations & need better tooling like forges, unlike Git, which already has a ton of options to choose from.
You don't need to, Jujutsu[1] gives you most of mercurial with some additional stuff. And relevant for this subthread, Tangled appears to work with it[2].
Off-topic, but as a non-native speaker I’m curious if it’s common to say “more and more slow” as opposed to “slower and slower” (maybe to emphasize the adjective?)
For me (native speaker), I would say "more slow" is incorrect grammar no matter how you use it, though people will know what you mean. So you should say "slower and slower".
But a writer may use the construct for style reasons.
ex: "Vaster than empires, and more slow."
I think it's just not actually technically wrong in the first place, merely uncommon because it's a little awkward.
What makes it ok is it can be used deliberately to build imagery or to intentionally trigger thought because you have to stop to parse it.
What makes it "wrong" is exactly that having to stop to parse it. It obfuscates the meaning, like saying "increase the decrease": a decrease is a property that may be increased, so it's legal, but you have to stop to puzzle it out rather than knowing what it means instantly, without thought.
For real. I've been hearing the interface is slow and requires Javascript for years and never really paid much mind, it worked for me. But lately the page loading has gotten abusively slow. I don't think it can be simply blamed on React because that move was made long before this started.
I've taken to loading projects in github.dev for navigating repos so I pay the js tax just once and it's fine for code reading. But navigating PRs and actions is terrible.
I freely admit I am out of my depth and have nothing educational to add on the subject. I have but four things to say:
1. Oh! It's "d.i.l.l.o."! I misread that as something else.
2. After reading many comments in this thread, I must admit I am stupefied at the sheer amount of stuff that can go into merely setting up and maintaining a version control system for a project.
3. I have cited every one of the same problems OP enumerates as my argument for switching new projects over to self-hosted fossil. It also helps a good bit with #2 above when you're a small organization and you're the sole software engineer, sysadmin, and tier >1 support. It's a much simpler VCS that's closer to using perforce in my experience. YMMV, but it's the kind of VCS that doesn't qualify as a skill on a resume.
4. I also find GH deploy keys frustrating because I can't use the same key for multiple repositories. I have 3 separate applications that each run on 4 machines in my cluster, and I have to configure 12 separate deploy keys on GitHub and in my ~/.ssh/config file.
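The usual workaround is one `Host` alias per deploy key in `~/.ssh/config`, which is exactly where those 12 entries pile up; a sketch of one of them (names hypothetical):

```
# ~/.ssh/config
Host github-app1
    HostName github.com
    User git
    IdentityFile ~/.ssh/deploy_key_app1
    IdentitiesOnly yes
```

with that repo's remote set to `git@github-app1:org/app1.git` instead of `git@github.com:...`.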
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
I love this. I used to be a big fan of Linear (because the alternatives were dog water), but this also opened up the question: "why even have a separate, disconnected tool?"
Most of my personal projects have a TODO.md somewhere with a list of things I need to work on. If people really need a frontend for bugs, it wouldn't be more than just rendering that markdown on the web.
Well, if your bugs can be specified clearly in plain text and plain text only, then yeah, I'd also advocate for this approach. Unfortunately, that's not really the case in any bigger software project. I need screenshots, video recordings that are 100 megs, cross-issue linking etc. I hate JIRA (of course) but it gets it right.
As for gitlab versus github: I understand that gitlab may have more features and thus options, but I can't stand its default UI. Every time I use it I am annoyed compared to the github variant. Perhaps gitlab is nicer to have for teams, but from a user's perspective, I feel gitlab is worse than github (UI-wise primarily).
I run a math circle in my area and use Forgejo with the kids for their solutions in LaTeX and Python. It works great for me, and it's super easy to enforce logins and reset passwords.
Another social issue on GitHub: you cannot use the "good first issue" tag on a public repository without being subjected to low quality drive-by PRs or AI slop automatically submitted by someone's bot.
I think the issue with centralization is still understated. I know developers who seem to struggle reading code if it's not presented by VS Code or a GitHub page. And then, why not totally capture everyone into developing just with GitHub Codespaces?
This is exactly what well-intentioned folk like to see: it's solving everyone's problems! Batteries included, nothing else is needed! Why use your own machine or software that doesn't ping into a telemetry hell-hole of data collection on a regular basis?
> GitHub has been useful to store all repositories of the Dillo project, as well as to run the CI workflows for platforms in which I don't have a machine available (like Windows, Mac OS or some BSDs).
The post does not mention CI anywhere else, are they doing anything with it, keeping it on GitHub, or getting rid of it?
> Furthermore, the web frontend doesn't require JS, so I can use it from Dillo (I modified cgit CSS slightly to work well on Dillo).
That sounds like a bad approach to developing a Web browser, surely it would be better to make Dillo correctly work with the default cgit CSS (which is used by countless projects)?
No doubt this is desirable. However, adding all the CSS features required to support cgit may have been a lot more work than editing cgit's CSS. It's an attempt at avoiding yak shaving: adding recursive sub-projects that balloon a project's scope of work far beyond the original plan.
Dillo is actively developed, and the project of "migrate away from github" is complete, so now other work can be started and completed (like adding the CSS features required to support mainline cgit).
This was the part that mystified me. Love it or hate it, GitHub Actions is free. Alternative providers like Codeberg have much tighter limits on it, and it sounds unlikely the author's solution includes CI at all.
>frontend barely works without JavaScript, ... In the past, it used to gracefully degrade without enforcing JavaScript, but now it doesn't.
And the github frontend developers are aware of these accessibility problems (via the forums and bug reports). They just don't care anymore. They just want to make the site appear to work at first glance which is why index pages are actual text in html but nothing else is.
I'd love to hear the inside story of GitHub's migration of their core product features to React.
It clearly represents a pretty seismic cultural change within the company. GitHub was my go-to example of a sophisticated application that loaded fast and didn't require JavaScript for well over a decade.
The new React stuff is sluggish even on a crazy fast computer.
My guess is that the "old guard" who made the original technical decisions all left, and since it's been almost impossible to hire a frontend engineer since ~2020 or so that wasn't a JavaScript/React-first developer the weight of industry fashion became too much to resist.
But maybe I'm wrong and they made a technical decision to go all-in on heavy JavaScript features that was reasoned out by GitHub veterans and accompanied by rock solid technical justification.
GitHub have been very transparent about their internal technical decisions in the past. I'd love to see them write about this transition.
> But beyond accessibility and availability, there is also a growing expectation of GitHub being more app-like.
> The first case of this was when we rebuilt GitHub projects. Customers were asking for features well beyond our existing feature set. More broadly, we are seeing other companies in our space innovate with more app-like experiences.
> Which has led us to adopting React. While we don’t have plans to rewrite GitHub in React, we are building most new experiences in React, especially when they are app-like.
> We made this decision a couple of years ago, and since then we’ve added about 250 React routes that serve about half of the average pages used by a given user in a week.
It then goes on to talk about how mobile is the new baseline and GitHub needed to build interfaces that felt more like mobile apps.
(Personally I think JavaScript-heavy React code is a disaster on mobile since it's so slow to load on the median (Android) device. I guess GitHub's core audience are more likely to have powerful phones?)
For contrast, gitea/forgejo use as little JavaScript as possible, and have been busy removing frontend libraries over the past year or so. For example, jquery was removed in favor of native ES6+.
Let them choke on their "app-like experience", and if you can afford it, switch over to either one. I cannot recommend it enough after using it "in production" daily for more than five years.
I honestly believe that the people involved likely already wanted to move over to React/SPAs for one reason or another, and were mostly just searching for excuses to do so - hence these kinds of vague and seemingly disproportionate reasons. Mobile over desktop? Whatever "app-like" means over performance?
Non-technical incentives steering technical decisions is more common than we'd perhaps like to admit.
What's nuts about that presentation is that the github frontend has gone from ~.2 to >2 Million lines of code in the last 5-6 years. 10x the code... to get slower?
That also means a much larger team and great possibilities for good perf reviews, so basically an excellent outcome in a corporate env. People follow incentives.
To be fair, the developers might care, but upper management certainly doesn't, and they're the ones who decide if those developers make their rent this month.
It's one step forward, two steps back with this "server side rendering" framing of the issue, and in practice with observing Microsoft GitHub's behavior. They'll temporarily enable text on some type of page on the site in response to accessibility issues, then a few months later remove it from that page type and even more besides. As that thread and others I've participated in show, this is a losing battle. Microsoft GitHub will be a JavaScript-only application in the end. People should consider moving their personal projects accordingly. For work, well, one often has to do very distasteful and unethical things for money. And GitHub is where the money is.
Although I'm not a fan of GH, I appreciate the ability to see how popular/valid some project is by looking at the number of stars (I know this is far from a perfect signal).
I'm much less likely to try projects that have a low number of stars, or projects in different places.
One of the things Forgejo has been working on is federation, such that hopefully someday we can replicate the discoverability of GitHub without a central provider.
> the [GitHub] frontend barely works without JavaScript, so we cannot open issues, pull requests, source code or CI logs in Dillo itself, despite them being mostly plain HTML
because Dillo is a simple browser without a JS engine. And that is a perfectly valid reason to leave GitHub.
What would be nice is an aggregator site one could submit to, with everyone hosting their projects on their own internet connection, so nobody is dependent on a single source for hosting. Maybe something like Bluesky with the AT protocol, but with git repositories.
Have a look at [Fossil](https://fossil-scm.org/), which is very easy to host and offers a code repository, bug tracker, wiki, forum, etc. It is not Git, however, but there are bridges, and one can even mirror a Fossil repo to GitHub.
There are ways around some of the issues there, such as using the GitHub API (I almost exclusively use the API), and/or using a user script (see below). Furthermore, on GitHub and on some other version control hosting services (such as GitLab), you can change "blob" to "raw" in the URL to access the raw files. However, as they say, it can be mirrored on multiple services (including self-hosting), and this would be a good idea, whether or not you use GitHub, so if you do not like GitHub then you do not have to use it.
Note that for some of the web pages on GitHub, the data is included as JSON data within the HTML file, although this schema is undocumented and sometimes changes. User scripts (which you might have to maintain due to these changes) can be used to display the data without any additional downloads from the server, and they can be much shorter and faster than GitHub's proprietary scripts.
Using a GPG key to sign the web page and releases is helpful (for the reasons they explain there), although there are some other things that might additionally help (if the conspiracy was not making it difficult to do these things with X.509 certificates in many ways).
GitHub frontend is mostly still their own [1] Web Components based library. They use Turbo to do client side reloading.
They have small islands of React based views like Projects view or reworked Pull Request review.
The thing is, even if you disable JavaScript, sites still load sloow. Try it yourself. Frontend code doesn’t seem to be the bottleneck.
> it is a single point of failure. I don't mean that GitHub is stored in a single machine, but it is controlled by a single entity which can unilateraly ban our repository or account
Besides the usability problems with bloated MS Shitware, this is a strategic point that I see considered far too rarely for my taste.
> This is specially problematic when active issues with developer notes begin to be filled with comments from users that have never contributed to the project and usually do more harm than good. This situation ends up causing burnout in developers.
This is a huge problem with github, and I would desperately want a github where issues are visible, but the ability to create or modify them is STRICTLY locked down to project members, and non-members can only post Discussions for problems. I'd really prefer to write the issue(s) myself based on an "intake discussion", so to speak.
I hope you will continue maintaining a mirror on GH. Some tools like deepwiki are excellent resources to learn about a codebase when there is not much documentation going around. But these tools only support pulling from GH.
A neat thing about GitHub is that every file on it can be accessed from URLs like https://raw.githubusercontent.com/simonw/llm-prices/refs/hea... which are served through a CDN with open CORS headers - which means any JavaScript application running anywhere can access them.
It's also not being served via a caching CDN, which means I don't feel comfortable running anything automated against it as that might add load to the server that they aren't ready for.
It's less about pulling and more about tools like DeepWiki making the assumption that its inputs live in GitHub, so repository URLs are expected to be GH URLs as opposed to a URL to a git repository anywhere.
That being said, there's no reason for tools like it to have those constraints other than pushing users into an ecosystem they prefer (i.e. GitHub instead of other forges).
I've been messing around with GitLab as a self hosted alternative for a few years. I do like it, but it is resource intensive!
For the past few days I've been playing with Forgejo (from the Codeberg people). It is fantastic.
The biggest difference is memory usage. GitLab is Ruby on Rails plus over a dozen services (gitlab itself, then nginx, postgres, prometheus, etc). Forgejo is written in Go and is a single binary.
I have been running GitLab for several years (for my own personal use only!) and it regularly, slowly eats up the entirety of the RAM on a 16GB VM. I have only been playing with Forgejo for a few days, but it is using only 300MB of the 8GB of RAM I allocated, and that machine is running both the server and a runner (idle, but still...).
I'm really excited about Forgejo and dumping GitLab. The biggest difference I can see is that Forgejo does not have GraphQL support, but the REST API seems, at first glance, to be fine.
EDIT: I don't really understand the difference between gitea and forgejo. Can anyone explain? I see lots of directories inside the forgejo volume when I run it using podman, and they clearly indicate the two are the same under the hood in many ways.
EDIT 2: Looks like forgejo is a soft fork in 2022 when there were some weird things that happened to governance of the gitea project: https://forgejo.org/compare-to-gitea/#why-was-forgejo-create...
> I'm really excited about Forgejo
Our product studio, currently around 50 users who need daily git access, moved to a self-hosted forgejo nearly 2 years ago.
I really can’t overstate the positive effects of this transition. Forgejo is a really straightforward Go service with very manageable mental model for storage and config. It’s been easy and cheap to host and maintain, our team has contributed multiple bugfixes and improvements and we’ve built a lot of internal tooling around forgejo which otherwise would’ve required a much more elaborate (and slow) integration with GitHub.
Our main instance is hosted on premise, so even in the extremely rare event of our internet connection going offline, our development and CI workflows remain unaffected (Forgejo is also a registry/store for most package managers so we also cache our dependencies and docker images).
Wait, forgejo offers a built-in container registry? How does that work? I don't see that in the admin section at all.
Container registry and a lot more, they call it Package registry in the docs https://forgejo.org/docs/latest/user/packages/
Is it not the same as in Gitea? https://docs.gitea.com/usage/packages
edit: Ok, this answers my question: https://forgejo.org/compare-to-gitea/#is-there-a-list-of-fea...
Just run `podman login` or `docker login your.forgejo.instance.address`, then push to it as normal. The target repo must already exist. You can check the images under site administration -> packages.
Speaking of authentication, it also works as an OpenID provider, meaning every other piece of web software that supports it can authenticate against Forgejo... which in turn can look up users in other sources.
It also has wikis.
It's an underrated piece of software that uses a ridiculously small amount of computer resources.
That's so brilliant. Wow. I'm struggling to wrap my brain around how they not only support OCI (docker) but also APK (alpine) and APT (debian) packages. That's a very cool feature.
"Forgejo is also a registry/store for most package managers"
Do you know if it supports OpenWRT packages?
Since they support Alpine, and the recent switch of OpenWRT to the wonderful alpine apk package manager, I guess it is supported.
Ease of maintenance is an even bigger difference. We've been using gitea for a bit over five years now, and gitlab for a few years before that, and gitea requires no maintenance in comparison. Upgrades come down to pulling the new version and restarting the daemon, and take just a few seconds. It's definitely the best solution for self-hosters who want to spend as little time as possible on their infrastructure.
Backups are handled by zfs snapshots (like every other server).
We've also had at least 10× lower downtime compared to github over the same period of time, and whatever downtime we had was planned and always in the middle of the night. Always funny reading claims here that github has much better uptime than anything self-hosted from people who don't know any better. I usually don't even bother responding anymore.
I guess I'll just chime in that while Gitlab is a very heavy beast, I have self hosted it for over a decade with little to no issues. It's pretty much as simple as installing their Omnibus package repository and doing apt install gitlab-ce.
When I self hosted gitlab I never found the maintenance to be that bad, just change a version in a compose.yml, sometimes having to jump between blessed versions if I've missed a few back to back.
Like others, I've switched to Gitea, but whenever I do visit gitlab I can't help but think the design / UX is so much nicer.
My usual impression of GitLab is that it has too many functions I don't ever use, so the things I actually do want (code, issues, PRs, user permissions) are needlessly hidden. What's your workflow that you find GitLab's UX to be nicer than Gitea's?
For instance, I just got tripped up trying to sign out of my gitea instance, since the mobile design has two identical-looking avatar + username blocks on top of each other: one is the org switcher, the other is a menu (with no indicator) containing the sign out button.
I went to a project page, and it auto focused the search input (???), causing a zoom in on mobile.
I just prefer the design / look + feel of gitlab more than gitea/forgejo. It's not really a hot take; gitlab has been around a lot longer and has much more support.
That was my take too. It is a big project with a lot of functionality. But, I never needed all of that functionality, so it just seemed bloated to me. I switched over to Gitea for self-hosted code repositories (non-public repos behind a firewall) a while back and haven't had any issues thus far.
You can pin the ones you want in the left menu.
I found gitea's interface to be so unusably bad that i switched to full-fat GitLab.
Gitea refused to do some perfectly sensible action: I think it had something to do with creating a fork of my own repo. Looking online, there's zero technical reason for this, and the explanation given was "this is how GitHub does things". Immediately uninstalled. I'm not here for this level of disrespect.
> I found gitea's interface to be so unusably bad that i switched to full-fat GitLab.
Was this Gitea pre-UI redesign or after? 1.23 introduced some major UI overhauls, with additional changes in the following releases. Forgejo currently represents the Gitea 1.22 UI, reminiscent of earlier GitHub design.
I find Gerrit to also be very low maintenance.
If you have a high-availability or multi-site set-up you also don't need to take any downtime to upgrade.
https://forgejo.org/docs/latest/user/actions/basic-concepts/
It's a shame that GitHub won the CI race by sheer force of popularity and it propagates its questionable design decisions. I wish more VCS platforms would base their CI systems on Gitlab, which is much much better than GitHub actions.
The CI job definitions for sourcehut are a pleasure to use: https://man.sr.ht/builds.sr.ht/manifest.md
A really neat feature is that you can also trigger a job by just submitting a yaml file (with the web interface, the API or the cli) without needing to push a commit for each job. This is neat for infrequent tasks, or for testing CI manifests before committing them.
Both are yaml jungles, I hate them equally.
What exactly is the advantage of running something like GitLab vs what I do which is just a server with SSH and a file system? To create a new repo I do:
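Roughly, a sketch with `git init --bare` (a local path stands in for the SSH host so the commands run anywhere; the paths are illustrative):

```shell
# "Server-side": a bare repo (no working tree) that you push to over SSH;
# on a real host this would be `ssh example.com git init --bare repos/my-proj.git`.
git init --bare /tmp/repos/my-proj.git

# Local side: point origin at it (in practice: example.com:repos/my-proj.git).
git init -q /tmp/work/my-proj
git -C /tmp/work/my-proj remote add origin /tmp/repos/my-proj.git
```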
Then I just set my remote origin URL to example.com:repos/my-proj.git. The filesystem on example.com is backed up daily. Since I do not need to send myself pull requests for personal projects, and I track my own TODOs and issues via TODO.md, what exactly am I missing? I have been using GitHub for open source projects and work for years, but for projects where I am the only author, why would I need a UI besides git and my code editor of choice?
CI runners would be the main advantage of GitLab over bare Git, I think. Also if you want to show other people your personal project at some point, it may be nice to be able to link to a diff or a historical version of a file that they can see in a browser. Or just a syntax-highlighted file or a rendered Markdown or Jupyter file. Also previous release tarballs.
What exactly is the advantage of running something like a restaurant vs what I do at home which is just cook it myself?
-> convenience, collaboration, mobility
Personal projects that you work on by yourself do not need collaboration. I feel like I pretty clearly implied that in my comment.
We at $DAYJOB had an internal git server that was literally what the parent of this comment mentioned (`git init --bare`). It became a little cumbersome, so when I stumbled across forgejo, I was happy to see that importing the existing git repos was a breeze, just had to point the config to look at the existing git storage root and assign groups and permissions via the GUI.
Collaboration and specifically collaboration with non git nerds. That's primarily what made GitHub win the VCS wars back in the day. The pull request model appealed to anyone who didn't want to learn crafting and emailing patches.
Yes, it's the PRs, and there is a misunderstanding I think because the OP and the GP's use-cases are quite different. Self-hosting your own repository on a remote server (and perhaps sharing it with 1 or 2 collaborators) is simple but quite different than running a public open source project that solicits contributions.
I specifically was talking about “personal projects” and excluded PRs for the reason that I would be the only contributor.
I’d argue that if you can’t prepare a patch diff then your abilities as a contributing developer should be thoroughly questioned.
Yes, the projects that are emailing patches around generally have a much higher bar than the ones that accept GitHub PRs, but whatever works for a given project I guess.
You don’t! Forges are for collaboration outside of the rhythm of git commits. Are you happy to make a new commit every time you have something to add to an issue? With X issues and Y comments an hour, polluting the git timeline with commentary is going to become unhelpful.
Some forges even include(d) instant messaging!
https://secure.phabricator.com/Z1336
This is kind of like asking what the point of Dropbox is when we have rsync. Rsync is nice, but most people won't know how to use it.
Setting up a server with SSH and GitLab is more work than setting up a server with SSH. Dropbox is great and I use it but only because I can’t get the same functionality out of rsync without major additional orchestration. But if I am the only one working on my own project why would I need a second read-only UI for my own code?
If you're working alone you can also send raw IP packets down the wire by way of telegraph key if you'd like. What you do alone behind closed doors isn't really anyone's business and is up to you. For everyone else, the benefit of using Gitlab is that once it's set up, a wide range of users of varying skill levels and backgrounds can use it to collaborate.
> why would I need a UI besides git and my code editor of choice?
If you ever find yourself wishing for a web UI as well, there's cgit[1]. It's what kernel.org uses[2].
[1]: https://git.zx2c4.com/cgit/ [2]: https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...
I wish I could search in Gitlab like I could in Jira's SQL-esque query syntax, but otherwise its interface is a step-up, if still pretty "busy" for my taste.
If you want even more minimal, Gerrit is structured as a Java app with no external dependencies like databases, and stores all its configuration and runtime information on the filesystem, mostly as data structures in the git repos.
A shared filesystem is all you need to scale/replicate it, and it also makes the backup process quite simple.
I might be one of the few that is intrigued by this being that it’s Java but this looks really neat. Does it do git repositories like gitea, GitHub, etc, or is it more of a project management site for the repositories? They describe it as “code review”, so I wasn’t sure.
I’m a little put off on the google connection but it seems like it could run rather independently.
It necessarily hosts a git server (using jgit), but the primary interface is as a code review tool.
even browsing the git repos it hosts uses an embedded version of another tool (gitiles).
https://gerrithub.io/ is a public instance
It's hyper-focused on code review and CI integration, which it does really well.
It's not focused on all the other stuff that people think of in code forges (hosting the README in a pretty way, arbitrary page hosting, wiki, bug tracking, etc.) but can be integrated with 3rd party implementations of those fairly trivially.
> I’m a little put off on the google connection but it seems like it could run rather independently.
Yeah, it's actually a really healthy open-source project. Google contributes usually around 40% of the code, but you have other companies like GerritForge (disclaimer, I work here), Nvidia, SAP, Qualcomm, and the Wikimedia Foundation all contributing heavily to it.
The deployment may be simple, but at the same time, the Gerrit code review workflow is terrible.
Coming from Github myself, I cannot imagine going back to it after using Gerrit for even just a few days.
The workflow in Gerrit really makes a lot of sense; unfortunately it's the workflow in GitHub that has screwed up everyone's idea of what code review should look like[1], even by one of GitHub's co-founders' own admission.
[1] https://medium.com/@danielesassoli/how-github-taught-the-wor...
I personally find the rebase and stacking commit focused method of integration that Gerrit uses to be easier and cleaner than PR's in GitHub.
Having done CI integrations with both, Gerrit's APIs send pre- and post-merge events through the same channel, instead of needing multiple separate listeners like GitHub.
We've been looking at Forgejo too. Do you have any experience with Forgejo Actions you can share? That is one thing we are looking at with a little trepidation.
I set up actions yesterday. There are a few tiny rough edges, but it is definitely working for me. I'm using it to build my hugo blog, which "sprinkles" in a Svelte app, so it needs to have nodejs + hugo and a custom orchestrator written in Zig.
What I did:
> * Strange error: "Error: Open(/home/runner/.cache/actcache/bolt.db): timeout"
This will occur if you have a `forgejo-runner daemon` running while you try to use `exec` -- both are trying to open the cache database, and only the first to open it can operate. You can avoid this by pointing the daemon at a different cache directory (`cache.dir` in the config file), or by running the two processes as different users.
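For reference, the relevant knob lives under the `cache` section of the runner's config file (you can generate a template with `forgejo-runner generate-config`; exact keys may vary by version, and the path below is just an example):

```yaml
# Fragment of a forgejo-runner config file (values illustrative)
cache:
  # cache used by actions/cache steps
  enabled: true
  # give the daemon and `exec` different dirs (or run them as different
  # users) to avoid the bolt.db open timeout described above
  dir: /var/lib/forgejo-runner/cache
```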
> It's a bit strange there are two files IMHO.
The `.runner` file isn't a config file, it's a state file -- not intended for user editing. But yes, it's a bit odd.
We use them in our shop. It's quite straightforward if you're already familiar with Github Actions. The Forgejo runner is tiny and you can build it even on unsupported platforms (https://code.forgejo.org/forgejo/runner) e.g. we've setup our CI to also run on Macs (by https://www.oakhost.net) for App Store related builds. It's really quite a joy :)
Are you building MacOS apps? More specifically, are you doing code signing and notarization and stamping within CI? If so, is this written up somewhere? I really struggled with getting that working on GitLab. I did have it working, but was always searching for alternatives.
One concern the post brings up - single point of failure. Yes, in this case, blah blah big company microsoft blah blah (I don't disagree, but..). I'm more worried about places like Paypal/Google/etc banning than the beast from Redmond.
Self hosting is still a single point of failure, and the article's answer of "mirroring", well... that gives you redundancy for reads, but what about writes?
It's an interesting take on a purist problem.
Redundancy for read access to the source code is a concern for Dillo. Some years ago, the domain name registration lapsed, and was promptly bought by an impersonator, taking the official repository offline. If it hadn't been for people having clones of the repository, the source code and history would have been lost.
How do people find your online project and know it's you (instead of an impersonator) without relying on an authority, like GitHub accounts or domain names? It is a challenging problem with no good solution. At least now the project is alive again and more resilient than before.
I think it’s a fair concern. E.g. forgejo is a simple directory on disk, with an option to move that onto S3 storage. It really is a no-brainer to set that up for as much resilience as necessary, with various degrees of “advanced” depending on your threat model and experience. The lack of a FAANG/M in the equation makes it even more palatable.
I found the banning comment to be odd. That said, all it really takes is a policy change (something that I see as far more likely in Microsoft's case) or simply a change in the underlying software (again, somewhat likely with Microsoft) for the platform to become unusable for them. Keep in mind that Dillo is a browser for those who can't or don't want to fit into the reality of the modern web.
I used, administered, set up, and customized many on-prem gitlab instances for years. Gitlab doesn't memory leak; you're making that up. It's exactly as resource intensive as the services you set up. Can't say the same for JIRA et al.
This comment makes me suspect this entire thread as some astroturfing for that other product.
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
The hacker spirit alive and well.
got here to comment on that, hella impressive and looks so simple and clean.
That is an approach very few people would take. I would never do it, as I am sure it would cause me more trouble than any potential benefit.
That’s ok. Not everyone can be a chad C developer.
For better and for worse.
> GitHub seems to encourage a "push model" in which you are notified when a new event occurs in your project(s), but I don't want to work with that model. Instead, I prefer it to work as a "pull model", so I only get updates when I specifically look for them.
I agree with the sentiment, but want to point out that email can be used to turn push into pull, by auto-filtering the respective email notifications into a separate dedicated email folder, which you can choose to only look at when you want.
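For example, a Sieve rule along these lines would do it (the folder name is arbitrary; GitHub currently sends notification mail from notifications@github.com):

```sieve
require ["fileinto"];

# File all GitHub notification mail into a folder you read on your own schedule.
if address :is "from" "notifications@github.com" {
    fileinto "GitHub";
}
```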
And if that’s unsatisfactory, GitHub has its own notifications part of the UI.
This is a solution in search of a problem.
> Additionally, GitHub seems to encourage a "push model" in which you are notified when a new event occurs in your project(s), but I don't want to work with that model. Instead, I prefer it to work as a "pull model", so I only get updates when I specifically look for them. This model would also allow me to easily work offline. Unfortunately, I see that the same push model has been copied to alternative forges.
Someone kind enough to explain this to me? What's the difference between push model and pull model? What about push model makes it difficult to work offline?
AFAIK, the author wants to work the way SourceHut and the Linux kernel work: over e-mail.
When you're working with e-mails, you sync your relevant IMAP box to local, pulling all the proposed patches with it, hence the pull model.
Then you can work through the proposed changes offline, apply them to your local copy, and push the merged changes back online.
I would love to see more projects use git-bug, which works very well for offline collaboration. All bug tracker info is stored in the repo itself. https://github.com/git-bug/git-bug
It still needs work to match the capabilities of most source forges, but for small closed teams it already works very well.
Reminder that POP and IMAP are protocols, and nothing stops a code forge—or any other website—from exposing the internal messaging/notification system to users as a service on the standard IMAP ports; no one is ever required to set up a bridge/relay that sends outgoing messages to, say, the user's Fastmail/Runbox/Proton/whatever inbox. You can just let the user point their IMAP client to _your_ servers, authenticate with their username and password, and fetch the contents of notifications that way. You don't have to implement server-to-server federation typically associated with email (for incoming messages), and you don't have to worry about deliverability for outgoing mail.
All of this makes sense. Thank you for explaining. I don't think I understand the difference though.
Like are they calling the "GitHub pull request" workflow as the push model? What is "push" about it though? I can download all the pull request patches to my local and work offline, can't I?
A GitHub pull request pushes a notification/e-mail to you to handle the merge, and you have to handle the pull request mostly online.
I don't know how you can download the pull request as a set of patches and work offline; rather, you have to open a branch, merge the PR into that branch, test things, and merge that branch into the relevant one.
Or you have to download the forked repository, run your tests to see whether the change is relevant/stable and whatnot, and if it works, you can then merge the PR.
---
edit: Looks like you can get the PR as a patch or diff, and it's trivial, but you have to be online again to get it that way. So getting your mails from your box is not enough; you have to get every PR as a diff, with a tool or manually, and then organize them. E-mail is a much more unified and simple way to handle all this.
---
In either case, reviewing the changes is not possible when you're offline, plus the pings from PRs are distracting if your project is popular.
Seems like you found it, but for others: one of the easiest ways to get a PR's diff/patch is to just put .diff or .patch at the end of its URL. I use this all the time!
Random PR example, https://github.com/microsoft/vscode/pull/280106 has a diff at https://github.com/microsoft/vscode/pull/280106.diff
Another thing that surprises some is that GitHub's forks are actually just "magic" branches, i.e. the commits on a fork exist in the original repo: https://github.com/microsoft/vscode/commit/8fc3d909ad0f90561...
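Since the trick is pure URL manipulation, it is easy to script; a minimal sketch (the helper name is my own):

```python
def pr_diff_url(pr_url: str, kind: str = "diff") -> str:
    """Return the plain-text diff/patch URL for a public GitHub PR.

    GitHub serves the raw diff of any pull request at its normal URL
    with a ".diff" or ".patch" suffix appended; no API token needed.
    """
    return pr_url.rstrip("/") + "." + kind
```

The ".patch" variant comes in git format-patch form, so it can be fed straight to git am for offline application.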
It’s bonkers to me that there isn’t a link to the plain patch from the page. Yes, it’s trivial to add a suffix once you know, but lots of people don’t, as evidenced by this thread.
Discoverability in UX seems to have completely died.
> It’s bonkers to me that there isn’t a link to the plain patch from the page.
It's yet another brick on the wall of the garden. That's left there for now, but for how long?
IOW, it's deliberate. Plus, GitHub neglects to add trivial features (e.g., deleting projects, an "add review" button, etc.) while porting their UI.
It feels like they don't care anymore.
You could set up a script that lives in the cloud (so you don't have to), receives PRs through webhooks, fetches any associated diff, and stores them in S3 for you to download later.
Maybe another script to download them all at once and apply each diff to its own branch automatically.
Almost everything about git and github/gitlab/etc. can be scripted. You don't have to do anything on their website if you're willing to pipe some text around the old way.
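A hedged sketch of that kind of piping in Python against the public REST API (diff_url is a real field on pull-request objects; the rest is illustrative):

```python
import json
from urllib.request import urlopen

GITHUB_API = "https://api.github.com"

def list_open_prs(owner: str, repo: str) -> list[dict]:
    # GET /repos/{owner}/{repo}/pulls?state=open returns a JSON array
    # of PR objects (paginated, 30 per page by default).
    url = f"{GITHUB_API}/repos/{owner}/{repo}/pulls?state=open"
    with urlopen(url) as resp:
        return json.load(resp)

def diff_urls(prs: list[dict]) -> list[str]:
    # Each PR object carries a "diff_url" pointing at the raw diff,
    # ready to be fetched and stashed wherever you like (S3, a local
    # directory, ...), then applied with "git apply" on its own branch.
    return [pr["diff_url"] for pr in prs]
```
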
Why complicate the workflow when it can be solved with a simple e-mail?
> Almost everything about git and github/gitlab/etc. can be scripted.
Moving away from GitHub is more philosophical than technical at this point. I also left the site the day they took Copilot to production.
I would say it is time/life management: push tells you to do something now. In pull I check each Friday afternoon what's up in my hobby project and work on it for a few hours and then call it a day and be uninterrupted till next week.
Yep, that's what I meant :)
We are in the diaspora phase; there is a steady stream of these announcements, each with a different GitHub alternative. I speculate that within a few months, the communities will have settled on a single dominant one. I'm curious whether it will be one of the existing ones, or something new. Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
This has been going on for a decade; at the beginning it was projects moving to GitLab, and now there are a lot of alternative projects, but GitHub is still the only one that counts for discoverability. It's a very small minority of projects that move away from GitHub, and it's way too early to declare GitHub doomed.
No different than everyone talking about the next “iPhone Killer” when someone other than Apple releases a phone. Although, I think that rhetoric has largely died down.
Github is fine for discoverability but as a development platform I think it's going to die. Public issues/PRs are a cesspool now and going to get worse, and agentic workflows are going to drive companies to want to hide how the sausage is made. People will gradually migrate to alternatives and mirror to Github while it remains relevant.
I’d guess most revenue comes from enterprise accounts which are not public.
I always stayed with GitHub because it just worked the best. GitLab was slow and janky. gitea and its various forks lacked features and felt a step backwards. Sourcehut workflow is far too opinionated for my liking. Don't even get me started on GNU Savannah.
Some parts of the Free Software/Open Source crowd have always bemoaned the rise of GitHub, "because obviously you should use Free Software, it's your ethical duty!" Most people just use what works best, including many Free Software devs. There is a loud minority (even louder in bubbles like HN), but for most people it's just one factor out of many, at best.
The reason GitHub became dominant is fairly simple: it just worked the best. That doesn't mean it was perfect (remember how long it took for line numbers to not be copied from code examples?), but the alternatives were even worse.
It's interesting to see how badly they're messing it up. You'd think that making a new react-based frontend for a fairly uncomplicated issue tracker wouldn't be too hard, but seems like it is. Some initial bugs after a rewrite are normal, but ... it's been a year? I still regularly just see closed issues in my issue overview. The back button is basically just broken. These are not obscure heisenbugs: these are bugs you find after using it for five minutes. The entire experience is just so janky.
I don't think Github is dying at this moment. I do think that the regression of UX quality is a necessary pre-condition for its death. Like many things its death will happen "very gradually, and then suddenly all at once". Sourceforge once seemed omnipresent and that changed very quickly.[1] But who knows where things will end up in five or ten years?
[1]: I'd like to pre-empt the inevitable "that's because of the adware" comment that someone always seems to post: that's a false history. The adware happened well after it already lost its position and was the desperate attempt of a declining struggling platform for income.
Gitlab did seem like a hope. But they very quickly became an even more massive and slow SPA javascript app than even github was.
Different devs have different preferred ways to work and collaborate. I doubt the FOSS community will converge on a single solution. I think we’re at a point of re-decentralization, where devs will move their projects to the forge that satisfies their personal/group requirements for control, hosting jurisdiction, corporate vs community ownership, workflow, and uptime.
This is due to increasing competition in the source forge space. It’s good that different niches can be served by their preferred choice, even if it will be less convenient for devs who want to contribute a patch on a more obscure platform.
> I speculate that within a few months, the communities will have settled on a single dominant one.
The solutions on the roadmap are not as centralized as GitHub. There is a real initiative to promote federation so we would not need to rely on one entity.
I love this, and hope it works out this way. Maybe another way to frame it: In 2 years, what will the "Learn Python for Beginners" tutorials direct the user towards? Maybe there will not be a consensus, but my pattern-matching brain finds one!
The bigger question is whether we want a single dominant replacement, or whether it just means we'll be back in the same place in 5 years.
> I speculate that within a few months, the communities will have settled on a single dominant one.
I really hope not. Heterogeneity is really valuable in this space, and there's really no "one size fits all" model.
I really liked GitHub and I would also pay more for it, but that does not seem to be a priority. On Safari the whole PR review is barely usable any longer because of bad performance, without gaining any new features discoverable to me. Obviously a lot of man-hours went into ruining the product, but I can’t understand why
On the upside I guess they use git internally as well so maybe they could just find a usable commit and revert all the crap they changed
Isn't that pretty much GitLab? But then most people still prefer GitHub anyway.
GitLab is too heavyweight for many projects. It’s great for corporations or big organizations like GNOME, but it’s slow and difficult to administer. It has an important place in the ecosystem, but I doubt many small projects will choose it over simpler alternatives like Codeberg.
Gitlab is worse than GitHub in every way.
At least GitHub adds new features over time.
Gitlab has been removing features in favor of more expensive plans even after explicitly saying they wouldn’t do so.
> At least GitHub adds new features over time.
Not as quickly as they add anti-features, imho.
Gitlab works fine for me. Been using it at work for a few years and recently moved all my personal repos there
Personally, I prefer the CI/CD setup on GitLab over GitHub Actions.
Horses for courses I guess ¯\_(ツ)_/¯
Gitlab is part of the reason I'm thinking along these lines: it has been around for a while as a known, reasonably popular alternative to GitHub. So I expected the announcement to be "We moved to GitLab", yet what I observe is "We moved to CodeHouse" or "We moved to Source-Base". The self-hosting here, with mirrors to two services I'm not familiar with, is another direction.
I think people are wary of moving to GitLab because it's a similarly large platform and they don't want to repeat their mistakes
gitlab has also gone full slop
It looks like all they’re doing is griping over frontends and interfaces that do all the custodial work other than version control (i.e., all the baked-in git provisions).
How do you rate e-mail's candidacy here?
The settling on a dominant one won't happen; self-hosting just becomes more popular.
> Perhaps a well-known company or individual will announce one; it will have good marketing, and dominate.
Hah, exactly what we’re attempting with Tangled! Some big announcements to come fairly soon. We’re positioning ourselves to be the next social collab platform—focused solely on indies & communities.
... and it will be SourceForge. finally.
We’d love to have the Dillo project on Tangled! ;) https://tangled.org
Nice to see it works with no JS
I wish Tangled supported alternatives to Git
Just curious, what are you using instead of git, and why? :)
Darcs & Pijul as the Patch Theory-based approach eliminates an entire class of merge conflicts which make working in a distributed manner more feasible. These DVCS have good (subjectively “better”) foundations & need better tooling like forges—unlike Git that already has a ton of things to choose from.
Darcs? Fossil? Subversion?
On their home page they have links to some repositories (or equivalent term) on a couple of Darcs hosts.
https://hub.darcs.net
https://smeder.ee
Here's a recent discussion about Darcs, it's the first I've heard of it.
Darcs, Friendly Version Control - https://news.ycombinator.com/item?id=43022059 (9 months ago | 76 comments)
Darcs is older than Git
Most version control systems are. At some point Linus said "let me show you how it's done", made git, and there haven't been many attempts since.
i still kind of miss mercurial.
You don't need to, Jujutsu[1] gives you most of mercurial with some additional stuff. And relevant for this subthread, Tangled appears to work with it[2].
[1] https://docs.jj-vcs.dev/latest/
[2] https://blog.tangled.org/stacking
> On the usability side, the platform has become more and more slow over time
The best reason right here.
Off-topic, but as a non-native speaker I’m curious if it’s common to say “more and more slow” as opposed to “slower and slower” (maybe to emphasize the adjective?)
For me (native speaker), I would say "more slow" is incorrect grammar no matter how you use it, though people will know what you mean. So you should say "slower and slower".
But a writer may use the construct for style reasons. ex: "Vaster than empires, and more slow."
I think it's just not actually technically wrong in the first place, merely uncommon because it's a little awkward.
What makes it ok is it can be used deliberately to build imagery or to intentionally trigger thought because you have to stop to parse it.
What makes it "wrong" is exactly that having to stop to parse it. It's obfuscated meaning to say "increase the decrease". A decrease is a property that may be increased, so it's legal, but you have to stop to puzzle it out rather than know what it means instantly without thought.
Slower and slower is more natural to my ear. More and more slow sounds weird.
Linguistic decay. First they came for our adverbs, now they're attacking our comparative adjectives.
For real. I've been hearing the interface is slow and requires Javascript for years and never really paid much mind, it worked for me. But lately the page loading has gotten abusively slow. I don't think it can be simply blamed on React because that move was made long before this started.
I've taken to loading projects in github.dev for navigating repos so I pay the js tax just once and it's fine for code reading. But navigating PRs and actions is terrible.
I freely admit I am out of my depth and have nothing educational to add. I have but four things to say on this subject:
1. Oh! It's "d.i.l.l.o."! I misread that as something else.
2. After reading many comments in this thread, I must admit I am stupefied at the sheer amount of stuff that can go into merely setting up and maintaining a version control system for a project.
3. I have cited every one of the same problems OP enumerates as my argument for switching new projects over to self-hosted fossil. It also helps a good bit with #2 above when you're a small organization and you're the sole software engineer, sysadmin, and tier >1 support. It's a much simpler VCS that's closer to using perforce in my experience. YMMV, but it's the kind of VCS that doesn't qualify as a skill on a resume.
4. I also find GH deploy keys frustrating because I can't use the same key for multiple repositories. I have 3 separate applications that each run on 4 machines in my cluster, and I have to configure 12 separate deploy keys on GitHub and in my ~/.ssh/config file.
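The usual workaround for point 4 is one SSH host alias per deploy key in ~/.ssh/config, then cloning through the alias; the host suffix and key path below are made up:

```
# ~/.ssh/config -- repeat one stanza per deploy key
Host github.com-app1
    HostName github.com
    User git
    IdentityFile ~/.ssh/deploy_key_app1
    IdentitiesOnly yes
```

Then clone with git clone git@github.com-app1:myorg/app1.git. The IdentitiesOnly line is what makes ssh offer only that key, so GitHub maps the connection to the right repository. Still one stanza per repo, but at least it's mechanical.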
> To avoid this problem, I created my own bug tracker software, buggy, which is a very simple C tool that parses plain Markdown files and creates a single HTML page for each bug.
I love this. I used to be a big fan of Linear (because the alternatives were dog water), but this also opened the question "why even have a separate, disconnected tool?"
Most of my personal projects have a TODO.md somewhere with a list of things I need to work on. If people really need a frontend for bugs, it wouldn't be more than just rendering that markdown on the web.
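A minimal sketch of that idea, with a hand-rolled bullet parser standing in for a real Markdown renderer:

```python
import html

def render_todo(md: str) -> str:
    # Turn top-level Markdown bullets ("- fix the parser") into an
    # HTML list; anything else is ignored for brevity.
    items = [
        f"  <li>{html.escape(line.strip()[2:])}</li>"
        for line in md.splitlines()
        if line.strip().startswith("- ")
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"
```
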
> As it is simply plain text
Well, if your bugs can be specified clearly in plain text and plain text only, then yeah, I'd also advocate for this approach. Unfortunately, that's not really the case in any bigger software project. I need screenshots, video recordings that are 100 megs, cross-issue linking etc. I hate JIRA (of course) but it gets it right.
Even in the case of Dillo, the migrated bugs from GitHub include ZIP files (that are still hosted on GitHub): https://bug.dillo-browser.org/50/
Sure, for personal stuff I mostly remember the issue; I can even point to which code snippet it is. For communicating with others that wouldn't fly.
We can enter e-mail's h*ll and just have attachments be base64 blobs.
If anyone wants to add Forgejo to their VM, I made a script that lets you quickly install server + runner, so you get the full setup:
https://wkoszek.github.io/easyforgejo/
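For a container-based setup instead, a minimal compose sketch; the image path is Forgejo's published OCI image, but the tag, ports, and volume layout are assumptions to verify against the Forgejo docs:

```
services:
  forgejo:
    image: codeberg.org/forgejo/forgejo:11   # pin whichever release is current
    ports:
      - "3000:3000"   # web UI
      - "2222:22"     # SSH clone access
    volumes:
      - forgejo-data:/data   # repositories, config, database
volumes:
  forgejo-data:
```

The same file works with podman-compose as well as docker compose.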
Isn't there one 'd' missing ...
As for gitlab versus github: I understand that gitlab may have more features and thus options, but I can't stand its default UI. Every time I use it I am annoyed compared to the github variant. Perhaps gitlab is nicer to have for teams, but from a user's perspective, I feel gitlab is worse than github (UI-wise primarily).
I run a math circle in my area and use Forgejo with the kids for their solutions in LaTeX and Python; it works great for me, and is super easy to enforce logins and reset passwords.
Always nice to see a project moving to self-hosted git repos rather than the other direction.
https://sfconservancy.org/GiveUpGitHub/
How about using tor to help with DNS redundancy? https://en.wikipedia.org/wiki/Tor_(network)
Excellent. I hope to see more of it.
Another social issue on GitHub: you cannot use the "good first issue" tag on a public repository without being subjected to low quality drive-by PRs or AI slop automatically submitted by someone's bot.
I think the issue with centralization is still understated. I know developers who seem to struggle reading code if it's not presented by VS Code or a GitHub page. And then, why not totally capture everyone into developing just with GitHub Codespaces?
This is exactly what well-intentioned folk like to see: it's solving everyone's problems! Batteries included, nothing else is needed! Why use your own machine or software that doesn't ping into a telemetry hell-hole of data collection on a regular basis?
Seems like a good place to pitch git-appraise: https://github.com/google/git-appraise
I'm not part of the project at all, but this is the only offline code review system I've found.
> GitHub has been useful to store all repositories of the Dillo project, as well as to run the CI workflows for platforms in which I don't have a machine available (like Windows, Mac OS or some BSDs).
The post does not mention CI anywhere else, are they doing anything with it, keeping it on GitHub, or getting rid of it?
> Furthermore, the web frontend doesn't require JS, so I can use it from Dillo (I modified cgit CSS slightly to work well on Dillo).
That sounds like a bad approach to developing a Web browser, surely it would be better to make Dillo correctly work with the default cgit CSS (which is used by countless projects)?
No doubt this is desirable. However, adding all the CSS features required to support cgit may have been a lot more work than editing cgit's CSS. It's an attempt at avoiding yak shaving: adding recursive sub-projects that balloon a project's scope of work far beyond the original plan.
Dillo is actively developed, and the project of "migrate away from github" is complete, so now other work can be started and completed (like adding the CSS features required to support mainline cgit).
> The post does not mention CI anywhere else, are they doing anything with it, keeping it on GitHub, or getting rid of it?
Yes, we have our own CI service. It is not public for now.
This was the part that mystified me. Love it or hate it, GitHub Actions is free. Alternative providers like Codeberg have much tighter limits on it, and it sounds unlikely the author's solution includes CI at all.
>frontend barely works without JavaScript, ... In the past, it used to gracefully degrade without enforcing JavaScript, but now it doesn't.
And the github frontend developers are aware of these accessibility problems (via the forums and bug reports). They just don't care anymore. They just want to make the site appear to work at first glance which is why index pages are actual text in html but nothing else is.
I'd love to hear the inside story of GitHub's migration of their core product features to React.
It clearly represents a pretty seismic cultural change within the company. GitHub was my go-to example of a sophisticated application that loaded fast and didn't require JavaScript for well over a decade.
The new React stuff is sluggish even on a crazy fast computer.
My guess is that the "old guard" who made the original technical decisions all left, and since it's been almost impossible to hire a frontend engineer since ~2020 or so that wasn't a JavaScript/React-first developer the weight of industry fashion became too much to resist.
But maybe I'm wrong and they made a technical decision to go all-in on heavy JavaScript features that was reasoned out by GitHub veterans and accompanied by rock solid technical justification.
GitHub have been very transparent about their internal technical decisions in the past. I'd love to see them write about this transition.
In answer to my own question about in-depth decision making, I just found this presentation from February 2025 by seven-year GitHub veteran Joel Hawksley: https://hawksley.org/2025/02/10/lessons-from-5-years-of-ui-a...
Relevant quote:
> But beyond accessibility and availability, there is also a growing expectation of GitHub being more app-like.
> The first case of this was when we rebuilt GitHub projects. Customers were asking for features well beyond our existing feature set. More broadly, we are seeing other companies in our space innovate with more app-like experiences.
> Which has led us to adopting React. While we don’t have plans to rewrite GitHub in React, we are building most new experiences in React, especially when they are app-like.
> We made this decision a couple of years ago, and since then we’ve added about 250 React routes that serve about half of the average pages used by a given user in a week.
It then goes on to talk about how mobile is the new baseline and GitHub needed to build interfaces that felt more like mobile apps.
(Personally I think JavaScript-heavy React code is a disaster on mobile since it's so slow to load on the median (Android) device. I guess GitHub's core audience are more likely to have powerful phones?)
For contrast, gitea/forgejo use as little JavaScript as possible, and have been busy removing frontend libraries over the past year or so. For example, jquery was removed in favor of native ES6+.
Let them choke on their "app-like experience", and if you can afford it, switch over to either one. I cannot recommend it enough after using it "in production" daily for more than five years.
I honestly believe that the people involved likely already wanted to move over to React/SPAs for one reason or another, and were mostly just searching for excuses to do so, hence these kinds of vague and seemingly disproportional reasons. Mobile over desktop? Whatever "app-like" means over performance?
Non-technical incentives steering technical decisions is more common than we'd perhaps like to admit.
What's nuts about that presentation is that the github frontend has gone from ~.2 to >2 Million lines of code in the last 5-6 years. 10x the code... to get slower?
That also means a much larger team and great possibilities for good perf reviews, so basically an excellent outcome in a corporate env. People follow incentives.
github is a tool used where code is written: on desktop computers
no-one cares about the github mobile experience
microsoft making the windows 8 mistake all over again
I interact with GitHub on my mobile phone every day.
yeah and I bet three people used Windows 8 on tablets too.
I think you are wildly underestimating how common it is for people to use GitHub from a phone.
It's where I interact with notifications about new issues and PRs for one thing. I doubt I'm alone there.
I think you're very much in the minority, but I guess we can't really know.
Who has ever used github on mobile?
I'd like to see their logs about this.
Me, every day.
And what do you achieve by doing that?
Seems a small audience to optimise for.
I file issues, comment on issues, review PRs and increasingly ship code entirely from my phone (thanks to LLM assistance).
All of six of these commits today were created and shipped from my phone while I was out and about on a nice dog walk: https://github.com/simonw/tools/commits/47b07010e3459adb23e1... - now deployed to https://tools.simonwillison.net
[flagged]
Yeah, it's really sad to be able to walk the dog for an hour a day, check out the local pelicans and simultaneously hack on fun projects on my phone.
Saying your life is about filling PRs on you phone while walking your dog is not the flex you think it is.
These are PRs against my own personal projects. I enjoy hobbies.
Criticizing other people's hobbies isn't the flex you think it is.
Neither one of those comments should have received replies. Just flag and move on.
Everybody enjoy their hobbies captain.
What does "app-like" even mean? It's a website, not an app. Don't they have a native app for phones?
If it's fast people don't stick around for as long. Make it sluggish and you get more stonks analytics.
Having to enable javascript to see a website is not an accessibility problem according to WCAG.
It is a very real accessibility problem if you're using Dillo, which does not support javascript.
it's also a real accessibility problem if you're trying to use sticks and rocks to access the internet
This is in the context of where that web browser is hosted, so it's quite relevant.
Why should you need JavaScript to render text and buttons? Were browsers unable to do this prior to the JavaScriptification of everything?
The same reason you need to use LLMs to code.
There's 'enabling javascript' and then there's 'requiring a javascript VM with bleeding edge features basically only found 3 browsers'.
To be fair, the developers might care, but upper management certainly doesn't, and they're the ones who decide if those developers make their rent this month.
https://github.com/orgs/community/discussions/62372#discussi...
It's one step forward, two steps back with this "server side rendering" framing of the issue, and in practice in observing Microsoft GitHub's behavior. They'll temporarily enable text on the web pages of the site in response to accessibility issues, then a few months later remove it on that type of page and even more others. As that thread and others I've participated in show, this is a losing battle. Microsoft GitHub will be a JavaScript-only application in the end. Human people should consider moving their personal projects accordingly. For work, well, one often has to do very distasteful and unethical things for money. And GitHub is where the money is.
Fixing accessibility problems won't make shareholders happy while forcing AI down our throats will.
I didn't know about forgejo, it looks pretty nice.
Forgejo is what codeberg runs on, which imo is an awesome alternative to github
Although I'm not a fan of GH, I appreciate the ability to see how popular/valid some project is by looking at the number of stars (I know this is far from a perfect signal). I'm much less likely to try projects that have a low number of stars, or projects in different places.
God forbid we'd have to actually look at the code itself to figure out whether something is good.
GP explicitly stated that they use stars to determine the popularity of a repository, not the code quality.
One of the things Forgejo has been working on is federation, such that hopefully someday we can replicate the discoverability of GitHub without a central provider.
Nice! Finding new places to stick a Dillo into.
To me, this sounds like a good change. And FWIW, I find I am using Dillo more and more these days.
I went to gitlab from github due to the Microsoft changes; my needs are very simple, and so far gitlab seems OK.
I also mirror just the current source on sdf.org via gopher. If gitlab causes issues, this could very well become my main site.
So, the first and foremost reason:
> the [GitHub] frontend barely works without JavaScript, so we cannot open issues, pull requests, source code or CI logs in Dillo itself, despite them being mostly plain HTML
because Dillo is a simple browser without a JS engine. And that is a perfectly valid reason to leave GitHub.
A good reason to move away from GitHub is that it is from Microsoft (FAMAG; a company that kissed Trump's ring).
Sourcehut is hosted in The Netherlands, and Codeberg in Germany.
What would be nice is an aggregator site one could submit to, with everyone hosting on their own internet connection, so nobody is dependent on a single source for hosting their projects. Maybe something like Bluesky with the AT protocol, but with git repositories.
There’s Tangled[0], but I don’t have personal experience with it.
[0]: https://tangled.org/
Actually, that does seem to be almost exactly what I was thinking. Thanks.
Have a look at [Fossil](https://fossil-scm.org/), which is very easy to host and offers a code repository, bug tracker, wiki, forum, etc. It is not Git, however, but there are bridges, and one can even mirror a Fossil repo to GitHub.
For just text there's Usenet, Freenet, Mastodon. Though these work for more than merely text.
I suppose something like this with git and source code exists on Tor.
During the Arab Spring and Hong Kong protests, Bluetooth was used to share messages whilst the internet was cut off.
There are ways around some of the issues there, such as using the GitHub API (I almost exclusively use the API), and/or using a user script (see below). Furthermore, on GitHub and on some other version control hosting services (such as GitLab), you can change "blob" to "raw" in the URL to access the raw files. However, as they say, it can be mirrored on multiple services (including self-hosting), and this would be a good idea, whether or not you use GitHub, so if you do not like GitHub then you do not have to use it.
Note that for some of the web pages on GitHub, the data is included as JSON data within the HTML file, although this schema is undocumented and sometimes changes. User scripts (which you might have to maintain due to these changes) can be used to display the data without any additional downloads from the server, and they can be much shorter and faster than GitHub's proprietary scripts.
Using a GPG key to sign the web page and releases is helpful (for the reasons they explain there), although there are some other things that might additionally help (if the conspiracy was not making it difficult to do these things with X.509 certificates in many ways).
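The blob-to-raw URL rewrite mentioned above, as a tiny helper (it works for GitHub-style paths; other forges may lay out URLs differently):

```python
def blob_to_raw(url: str) -> str:
    # Swap the first "/blob/" path segment for "/raw/" to get the raw
    # file contents instead of the HTML file viewer.
    return url.replace("/blob/", "/raw/", 1)
```

On GitHub the /raw/ path redirects to raw.githubusercontent.com; on GitLab and cgit the raw path is served directly.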
>The most annoying problem is that the frontend barely works without JavaScript,
Not only did they spend years rewriting the frontend from Pjax to (I think) React, they also managed to lose customers because of it.
GitHub's frontend is mostly still their own Web Components based library [1]. They use Turbo to do client-side reloading. They have small islands of React-based views, like the Projects view or the reworked Pull Request review. The thing is, even if you disable JavaScript, pages still load slowly. Try it yourself. Frontend code doesn’t seem to be the bottleneck.
[1] https://github.blog/engineering/architecture-optimization/ho...
> it is a single point of failure. I don't mean that GitHub is stored in a single machine, but it is controlled by a single entity which can unilaterally ban our repository or account
Besides the usability problems with bloated MS Shitware, this is a strategic point that I see considered far too rarely for my taste.
> This is specially problematic when active issues with developer notes begin to be filled with comments from users that have never contributed to the project and usually do more harm than good. This situation ends up causing burnout in developers.
This is a huge problem with GitHub, and I would desperately want a GitHub where issues are visible but the ability to create or modify them is STRICTLY locked down to project members, with non-members only able to post Discussions for problems. I really prefer to write the issue(s) based on an "intake discussion", so to speak.
[flagged]
Dillo is a 25 year old project some of us know very well and use.
Obscurity is subjective. It might be obscure for you, and that's OK, but Dillo is not obscure for many people.
Dillo is very cool and the blog post says more than «I moved away from Github.»
It's actually quite interesting, I recommend to read it!
Obscure? Dillo has been alive more time than tons of HN users.
I hope you will continue maintaining a mirror on GH. Some tools like DeepWiki are excellent resources to learn about a codebase when there is not much documentation going around, but these tools only support pulling from GH.
I have the exact opposite experience where I had to block multiple such "excellent resources" from my search results.
How is pulling dependent on github?
Git pulling isn't unique to GitHub, and it works over HTTP or SSH?
A neat thing about GitHub is that every file on it can be accessed from URLs like https://raw.githubusercontent.com/simonw/llm-prices/refs/hea... which are served through a CDN with open CORS headers - which means any JavaScript application running anywhere can access them.
Demo: https://tools.simonwillison.net/cors-fetch?url=https%3A%2F%2...
That feature seems common to other git hosts / forges. For example, here's one of Dillo's files, from a few commits ago, from their cgit-based host
https://git.dillo-browser.org/dillo/plain/src/ui.cc?id=29a46...
That doesn't have open CORS headers: https://tools.simonwillison.net/cors-fetch?url=https%3A%2F%2...
It's also not being served via a caching CDN, which means I don't feel comfortable running anything automated against it as that might add load to the server that they aren't ready for.
It's less about pulling and more about tools like DeepWiki making the assumption that its inputs live in GitHub, so repository URLs are expected to be GH URLs as opposed to a URL to a git repository anywhere.
That being said, there's no reason for tools like it to have those constraints other than pushing users into an ecosystem they prefer (i.e. GitHub instead of other forges).