What is a good Linux distribution whose 64-bit version is fully functional?
Asked by Thammuz (9287), December 1st, 2010
I’m currently an Ubuntu user, but I’ve grown tired of having to use the 32-bit version when I spent good money on a 64-bit CPU.
Ever since I switched to Ubuntu I’ve tried to install the amd64 version of the system, and I’ve found three problems with it: no Flash, my torrent client crashes, and USB transfer rates suck harder than a black hole made of vacuum cleaners. While the first two are by no means Canonical’s fault, I’m inclined to think that if they wasted less time on pointless interface makeovers and more time on actual bugfixing, the last one would have been fixed by now.
This whole thing is even more moronic when you consider what the issue is. It’d still be unforgivable if it were another kind of problem, but it’s even worse considering USB drives are by now the standard data transfer medium and can reach terabytes of capacity.
The actual issue is that the transfer rate slowly decreases, eventually dropping below 1 MB/s, which basically means I’m transferring locally at the speed I would ordinarily download from Megaupload. It’s a problem that goes completely unnoticed as long as you copy small files, because the decline needs time to become noticeable. So you’ll never see it until you try copying 2 GB worth of data to a USB drive, right before leaving for school where a friend is waiting for you to hand it over.
I’ve been using Ubuntu for two years now and I’m tired of this bullshit. At least three stable releases, one of them a Long Term Support release, and this still isn’t fixed, forcing me to sacrifice my CPU’s performance on the altar of their laziness.
So, in short: what distro should I switch to? I’ve only ever used Ubuntu, but by now I think I have a good grasp of the mechanics behind it. All I really care about is for the system to behave similarly to Ubuntu on the surface; fixing things behind the scenes is pretty much a matter of finding a tutorial and following it, even if you don’t have a higher-level comprehension of system programming. For instance, I’d love a tool like aptitude, which handles the download and installation of packages without sending me on fetch-quests around the Internet, but I can do without it if necessary.
Also: can source code packages be compiled on 64-bit systems even if they’re not specifically made for them? All my attempts lasted too little time for me to find out.
13 Answers
I use Gentoo and it’s 64-bit. Mostly. I use a 32-bit version of Firefox for Flash, although I’m not sure I need to; I seem to remember a 64-bit version of Flash being available.
For torrents I use Transmission and never had a problem, other than not liking Transmission. And I don’t have a problem with USB transfers, either.
Gentoo is a source-based distro, so if you’re used to downloading packages and having them ready to go, you’ll be in for a shock while you wait for everything to compile. I’ve never noticed a distinction between 32-bit and 64-bit tarballs, though, so I’d say it’s likely that a large portion of source code will compile on a 64-bit processor with little or no change (unless the changes are wrapped up in the source).
@mrentropy About compiling from source: what does it actually mean? I have a textbook comprehension of it, i.e. transforming source code into an actual executable, but the odd thing I’ve found is that sometimes, when you download a tarball (it’s not like I’ve never compiled, luckily), you find the scripts needed to compile, sometimes you find the whole thing ready to go, and sometimes something in between.
Which is actually pissing me off somewhat, because I’d love to be able to call the programs up from a terminal if need be (or make a desktop shortcut, for instance), and having them scattered all over my HDDs does not make for fun times in that respect.
@Thammuz That’s pretty much it: putting the source code through a compiler. Some projects have utilities to make sure it will compile (i.e. ./configure), some have scripts that do everything for you. Most use ‘make’, so all you have to type is ‘make’. If it’s ready to go, then you aren’t compiling the source.
In Gentoo’s case, there’s a package manager that downloads the sources, unpacks them, and then does the compiling. For the most part, all you’d need to do is type ‘emerge mozilla-firefox’ and you’d get Firefox.
It’s definitely interesting seeing new Linux users who have never compiled anything, or built their own kernel.
I’ve been using Ubuntu since 2004 and have had many many problems. I’ve been able to fix most of these, and most of the ones I couldn’t fix were due to flaky hardware (I have built most of my machines out of used and sometimes crap components.)
Currently I’m running 64-bit 10.04. No current problems with Flash or with USB transfers.
If you’re shopping for a new distro I recommend playing around with the livecd versions before installing. If it looks good you might consider running it in a VM for a while depending on your needs. The livecd won’t help you with Flash in most cases, since few of them can offer it pre-installed. Maybe Mint does, I’m not sure.
As for compiling software, I keep all my source code and binaries installed by me in my home directory, and require my users to do likewise.
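A sketch of how that home-directory setup works in practice (the script name and paths below are just examples; with autotools projects the same effect usually comes from passing --prefix at configure time):

```shell
# Install a program under $HOME instead of /usr -- no root needed,
# and everything you built yourself stays in one place.
mkdir -p "$HOME/.local/bin"

# Toy "program" standing in for a freshly built binary.
cat > hello-home <<'EOF'
#!/bin/sh
echo "installed in home"
EOF

# coreutils install(1) copies the file and sets its mode in one step.
install -m 755 hello-home "$HOME/.local/bin/hello-home"

# With autotools projects the equivalent is:
#   ./configure --prefix="$HOME/.local" && make && make install
"$HOME/.local/bin/hello-home"
```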
@mrentropy OK, but what’s with things like Dolphin (the emulator, not the KDE file manager) that come in a .tar but are actually already compiled and ready to go? So far, every time I get one of those programs, calling it up from the terminal (and therefore making desktop shortcuts) means I have to cd my way to its folder or type the full path on the command line, which is impractical, not to mention hard given my own limitations. If the code actually needs compiling it’s all good, because the build also puts everything in the right folders (the executable in bin or sbin, mainly, which in Ubuntu means you can call it from a command line) without needing me to amend the original code to make it work properly when the executable is in one folder and the data is somewhere else.
@phoebusg I resent that comment. I have compiled before, as I already said. What I don’t understand is why they say you can download the source and compile it when in reality there’s sometimes nothing to compile. I’ve never built my kernel, that I have to admit, but that aside I’ve been doing my fucking homework. The fact that I’d rather not compile, if I can avoid it, is only because I find it a very boring trial-and-error process: half the time the actual dependencies aren’t listed in the README or on the original site, and you end up compiling about ten times, installing whatever packages the compiler says are missing, before getting the final product.
@koanhead How would one go about doing something like that? How do you set up a different default folder for executables that is also used by make (or other package managers) as the default?
@Thammuz – I don’t really understand your question. I untar source tarballs into ~/Projects, cd into the untarred directory, and first check for a README file, which I expect to inform me of the build’s dependencies and of any deviations from the usual ”./configure; make; make install” process.
After following the build instructions I have a bunch of new files in the directory, including a functioning binary. Sometimes the “make install” script will put the files elsewhere, but this should be obvious from make’s output, and further should be clearly explained in the README.
Sometimes the process fails, and instead of a functioning binary I get a mess. Usually this is accompanied by errors in the make output. Usually there’s a make target called “make clean” or “make uninstall” which will remove the mess, restoring the tree to its pristine state so you can try again. Again, the README should mention that this is so.
In general I don’t attempt to compile software that has no README, or an incomplete one (or a similar file of obvious function). Fairly or not, I do judge software by its documentation.
@Thammuz – also, while make can conceivably be used as a package manager of sorts, it generally isn’t, and therefore does not have or need its own “default folder”. The FHS specifies certain folders for user binaries; /usr/local/bin is the usual choice for locally built software. Installing programs into your home folder means you are the only one who can use them (and you can fill a disk quota quickly that way), but on a single-user system this is no big deal.
Normally I just cd into the directory where the binaries are located when I want to execute them, but it’s easy enough to add it to your $PATH.
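For example (directory and script names below are made up — substitute wherever your binaries actually live):

```shell
# A binary living in some arbitrary folder under $HOME.
mkdir -p "$HOME/apps/demo"
cat > "$HOME/apps/demo/dolphin-demo" <<'EOF'
#!/bin/sh
echo "found on PATH"
EOF
chmod +x "$HOME/apps/demo/dolphin-demo"

# Add that folder to the search path. This lasts for the current
# shell only; append the same line to ~/.bashrc to make it stick.
export PATH="$PATH:$HOME/apps/demo"

# Now it runs from anywhere -- no cd and no full path needed.
dolphin-demo
```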
@koanhead Sometimes the “make install” script will put the files elsewhere, but this should be obvious from make’s output, and further should be clearly explained in the README.
OK, so far that’s obvious. The problem I imagine is that if I decide to move the binaries to another folder afterwards, it’s probably not going to work; at least that’s what happened when I tried it with Dolphin (which was already compiled in the tar file, oddly enough). Am I wrong in assuming that?
Normally I just cd into the directory where the binaries are located when I want to execute them, but it’s easy enough to add it to your $PATH.
Exactly what I needed to know. I think I can Google this one from here.
Try Linux Mint. It’s Ubuntu-based but has a ton of tweaks, so it may solve all your issues and be more user-friendly to boot. If that doesn’t work, I’d try Fedora, then maybe Zenwalk.
When you switch off Ubuntu, you lose all those quick and easy-to-find packages in the Ubuntu Software Center. Fedora and Zenwalk obviously have similar software managers, but I keep coming back to Ubuntu for the wealth of PPAs that make installing just about anything a breeze, since so many programs have PPAs if they aren’t in the main repository.
How do you find a PPA? Search “PROGRAM-NAME PPA” on Google.
How do you use a PPA? Run
sudo apt-add-repository PPA-LINK
You’ll find that PPA link on one of the pages from your Google search. It should be the big bold link, usually on launchpad.net.
@jediknight304 – Does Mint have a livecd he can test, and if so does it have the flash plugin preinstalled? Also, if you’ve used it, I’m interested in your impressions- feel free to PM me if you don’t want to ‘clog’ this thread (assuming fluther supports PMs, I’m a fluther noob so there’s lots I don’t know about it!)
@koanhead @jediknight304 I’ve tried Mint; it’s pretty much Ubuntu, only more Windows-like and with a lot of small changes, as @jediknight304 already said. I’ve never tried the 64-bit version, though. I might give it a shot. (Yes, it does have a live CD, and yes, Fluther supports PMs.)
Update: I tried installing the 64-bit version of Ubuntu 10.10 out of goodwill towards Canonical and, guess what, nothing changed.
Although I’ve found (via Google) that it’s likely to be a problem in the Linux kernel itself, stretching all the way back to 2006, which would be a disaster from my POV, because it would mean 32-bit or fuck off (read: fuck off. I’m patient and all, but if I have to sacrifice performance, at least on Windows I can play).
The odd part is that, apparently, back then it was a problem in the 32-bit version too.
I’m going to try installing the 64-bit version of Debian shortly; failing that, either Fedora, Gentoo, or Arch Linux (probably Gentoo).