... So will Mac get the new Dolphin build as well so I can play with my PC friends, or will it continue to get the shaft?
tl;dr answer: It's complicated.
Well, Apple expects you to run their Mac OS X software on an Apple computer, and I don't own an Apple computer. My only option is the dubious work of creating a "hackintosh" installation in order to run Mac OS X on non-Apple hardware. That's of questionable legality, since it violates the EULA, and it's a risky operation in terms of data loss.
As you might expect, this is a pain in the ***. A serious pain in the ***.
...
Now, for Linux builds, that's all doable, but deployment on Linux is very fragile. A given kernel version guarantees nothing about the base system libraries. You can run Linux 3.6 with:
- Any reasonably modern version of glibc, and a binary built against a newer glibc generally won't run against an older one (see the sketch after this list).
- Another libc entirely, like bionic (Android) or musl, with an entirely different userspace.
- Literally any package manager, including apt/dpkg, yum/rpm, or no package manager at all.
- And pretty much any set of binary loaders. It's almost all ELF, but nothing at all stops someone from using an exotic format.
Therefore, you don't really get "Linux builds." You get "Ubuntu" builds, or you get "Compatible with (most) common configuration" builds, or something to that effect.
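To make the libc point concrete, here's a minimal Python sketch of how a program can check which C library it's actually running against. It assumes a glibc system; `gnu_get_libc_version()` is a real glibc-only API, and its absence on musl or bionic is itself part of the point:

```python
import ctypes

# gnu_get_libc_version() is glibc-specific; on musl or bionic either the
# library lookup or the symbol lookup fails, which is itself a useful
# signal about the platform your binary landed on.
try:
    libc = ctypes.CDLL("libc.so.6")
    libc.gnu_get_libc_version.restype = ctypes.c_char_p
    print("glibc version:", libc.gnu_get_libc_version().decode())
except (OSError, AttributeError):
    print("not glibc: some other libc is in use")
```

Two machines reporting different versions here can behave differently with the exact same binary, which is why "works on my Linux box" means so little.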
But it's more complicated, because of package managers. Linux ain't like Windows, where you can just zip up an executable and some files and people can use it portably. Well, you CAN do this on Linux, but that doesn't mean it's a good idea. Most file managers launch programs with the working directory set to $HOME, so a program that loads files relative to the cwd ends up looking in $HOME instead of next to the executable. Why? Because the desktop stack doesn't attempt to facilitate "portable" executables, so there is always a bit of fuss involved in making them work right. You can do it, but people will not like it, it will be miserably hackish, and it's not a good long-term solution.
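If you ship a "portable" directory anyway, the one mitigation is to never trust the cwd. A minimal sketch in Python; the `settings.ini` file name is a made-up placeholder:

```python
import os
import sys

# Resolve resources relative to the program itself, never the cwd,
# since a file manager may have launched us with cwd set to $HOME.
base_dir = os.path.dirname(os.path.abspath(sys.argv[0]))

def resource_path(name):
    """Return an absolute path to a file shipped alongside the program."""
    return os.path.join(base_dir, name)

# Hypothetical data file shipped next to the executable.
with open(resource_path("settings.ini")) as f:
    print(f.read())
```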
So, you have to create packages. And packages have to contain a lot of information. Like dependencies - which are a good thing, don't get me wrong, but they can be complicated to deal with. And menu entries - the binary might be in the path, but that doesn't mean the user knows how to get to it. Most packages are expected to install menu entries. And there's more, like following each distro's conventions, which are sometimes enforced by the package manager itself.
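For the menu-entry part, the freedesktop.org convention is a `.desktop` file installed under `/usr/share/applications`. A sketch of what a packaging step might generate, assuming root at install time; "myapp" and its paths are placeholders, not a real package:

```python
# Sketch of a post-install step writing a freedesktop.org menu entry.
# "myapp" and all paths below are hypothetical placeholders.
DESKTOP_ENTRY = """\
[Desktop Entry]
Type=Application
Name=MyApp
Comment=An example application
Exec=/usr/bin/myapp
Icon=myapp
Categories=Utility;
"""

with open("/usr/share/applications/myapp.desktop", "w") as f:
    f.write(DESKTOP_ENTRY)
```

And that's just the menu entry; dependency metadata lives in the package control files on top of this.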
For Debian, you almost always want a repository for builds. Setting up the repository itself is not simple work, and end users have to accept your signing key. How? Well, that's not standardized either. You can ship a package they install that does the work for them, or give them a command to run to do it. There's little consensus, although the package method has been gaining popularity.
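The "package that does the work" approach boils down to dropping the key and a sources entry into place. A rough sketch of the equivalent in Python; every name and URL is a hypothetical placeholder, and the `signed-by` style shown is one of several ways apt can be configured:

```python
import urllib.request

# Hypothetical placeholders; a real package would ship the key in its
# payload rather than fetch it over the network at install time.
KEY_URL = "https://example.com/apt/archive-key.gpg"
KEYRING = "/usr/share/keyrings/example-archive-keyring.gpg"
SOURCES = "/etc/apt/sources.list.d/example.list"

# Install the signing key where apt can find it (the key file must be
# in binary, "dearmored" form for signed-by to accept it).
with urllib.request.urlopen(KEY_URL) as resp, open(KEYRING, "wb") as out:
    out.write(resp.read())

# Point apt at the repository, restricted to that key.
with open(SOURCES, "w") as f:
    f.write(
        "deb [signed-by={}] https://example.com/apt stable main\n".format(KEYRING)
    )
```

Either way, the user is trusting you with root-level software installs, which is exactly why there's so much ceremony around it.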
...
So you might be wondering why I typed up all of this crap just to say "It's really complicated." It's because I don't want people to feel like they're getting "shafted." I don't want to be releasing Windows-only builds, but most people are running Windows, and the other platforms require a real time investment. I feel it is necessary to explain exactly what goes into producing these packages. Linux's flexibility is great for developers but bad for software deployment. Mac OS X's tight integration with its hardware is great for end users but bad for software deployment. Windows has the advantages of extreme backwards compatibility, stability across machines, and ubiquity (you can run Windows on a Mac, and unless you're running an exotic architecture, a computer that runs Linux can almost certainly run Windows too).
All of that being said, I think the next goal is to indeed begin work on getting builds for all platforms. With time, this should come. In the meantime, power users shouldn't have too much trouble manually building.